00:00:00.001 Started by upstream project "autotest-per-patch" build number 120675
00:00:00.001 originally caused by:
00:00:00.001 Started by user sys_sgci
00:00:00.016 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.017 The recommended git tool is: git
00:00:00.017 using credential 00000000-0000-0000-0000-000000000002
00:00:00.019 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.036 Fetching changes from the remote Git repository
00:00:00.037 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.061 Using shallow fetch with depth 1
00:00:00.061 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.061 > git --version # timeout=10
00:00:00.107 > git --version # 'git version 2.39.2'
00:00:00.108 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.108 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.108 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:16.570 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:16.586 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:16.601 Checking out Revision a704ed4d86859cb8cbec080c78b138476da6ee34 (FETCH_HEAD)
00:00:16.601 > git config core.sparsecheckout # timeout=10
00:00:16.618 > git read-tree -mu HEAD # timeout=10
00:00:16.641 > git checkout -f a704ed4d86859cb8cbec080c78b138476da6ee34 # timeout=5
00:00:16.663 Commit message: "packer: Insert post-processors only if at least one is defined"
00:00:16.663 > git rev-list --no-walk a704ed4d86859cb8cbec080c78b138476da6ee34 # timeout=10
00:00:16.775 [Pipeline] Start of Pipeline
00:00:16.789 [Pipeline] library
00:00:16.790 Loading library shm_lib@master
00:00:16.791 Library shm_lib@master is cached. Copying from home.
00:00:16.808 [Pipeline] node
00:00:16.832 Running on WFP32 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:16.834 [Pipeline] {
00:00:16.847 [Pipeline] catchError
00:00:16.848 [Pipeline] {
00:00:16.860 [Pipeline] wrap
00:00:16.869 [Pipeline] {
00:00:16.877 [Pipeline] stage
00:00:16.879 [Pipeline] { (Prologue)
00:00:17.040 [Pipeline] sh
00:00:17.867 + logger -p user.info -t JENKINS-CI
00:00:17.889 [Pipeline] echo
00:00:17.891 Node: WFP32
00:00:17.899 [Pipeline] sh
00:00:18.236 [Pipeline] setCustomBuildProperty
00:00:18.250 [Pipeline] echo
00:00:18.251 Cleanup processes
00:00:18.257 [Pipeline] sh
00:00:18.548 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:18.548 82479 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:18.561 [Pipeline] sh
00:00:18.849 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:18.850 ++ grep -v 'sudo pgrep'
00:00:18.850 ++ awk '{print $1}'
00:00:18.850 + sudo kill -9
00:00:18.850 + true
00:00:18.865 [Pipeline] cleanWs
00:00:18.875 [WS-CLEANUP] Deleting project workspace...
00:00:18.875 [WS-CLEANUP] Deferred wipeout is used...
00:00:18.887 [WS-CLEANUP] done
00:00:18.891 [Pipeline] setCustomBuildProperty
00:00:18.905 [Pipeline] sh
00:00:19.190 + sudo git config --global --replace-all safe.directory '*'
00:00:19.269 [Pipeline] nodesByLabel
00:00:19.270 Found a total of 1 nodes with the 'sorcerer' label
00:00:19.280 [Pipeline] httpRequest
00:00:19.561 HttpMethod: GET
00:00:19.562 URL: http://10.211.164.101/packages/jbp_a704ed4d86859cb8cbec080c78b138476da6ee34.tar.gz
00:00:20.376 Sending request to url: http://10.211.164.101/packages/jbp_a704ed4d86859cb8cbec080c78b138476da6ee34.tar.gz
00:00:20.699 Response Code: HTTP/1.1 200 OK
00:00:20.769 Success: Status code 200 is in the accepted range: 200,404
00:00:20.770 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_a704ed4d86859cb8cbec080c78b138476da6ee34.tar.gz
00:00:21.337 [Pipeline] sh
00:00:21.627 + tar --no-same-owner -xf jbp_a704ed4d86859cb8cbec080c78b138476da6ee34.tar.gz
00:00:21.651 [Pipeline] httpRequest
00:00:21.658 HttpMethod: GET
00:00:21.659 URL: http://10.211.164.101/packages/spdk_3381d6e5bc15e7eb55d05cd9262b78557f205689.tar.gz
00:00:21.660 Sending request to url: http://10.211.164.101/packages/spdk_3381d6e5bc15e7eb55d05cd9262b78557f205689.tar.gz
00:00:21.676 Response Code: HTTP/1.1 200 OK
00:00:21.676 Success: Status code 200 is in the accepted range: 200,404
00:00:21.677 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_3381d6e5bc15e7eb55d05cd9262b78557f205689.tar.gz
00:00:36.677 [Pipeline] sh
00:00:36.963 + tar --no-same-owner -xf spdk_3381d6e5bc15e7eb55d05cd9262b78557f205689.tar.gz
00:00:39.524 [Pipeline] sh
00:00:39.824 + git -C spdk log --oneline -n5
00:00:39.824 3381d6e5b nvmf/rpc: fix input validation for nvmf_subsystem_add_listener
00:00:39.824 934164c7a test/nvmf: add missing remove listener discovery
00:00:39.824 38dca48f0 libvfio-user: update submodule to point to `spdk` branch
00:00:39.824 7a71abf69 fuzz/llvm_vfio_fuzz: limit length of generated data to `bytes_per_cmd`
00:00:39.824 fe11fef3a fuzz/llvm_vfio_fuzz: fix `fuzz_vfio_user_irq_set` incorrect data length
00:00:39.837 [Pipeline] }
00:00:39.854 [Pipeline] // stage
00:00:39.863 [Pipeline] stage
00:00:39.865 [Pipeline] { (Prepare)
00:00:39.885 [Pipeline] writeFile
00:00:39.903 [Pipeline] sh
00:00:40.187 + logger -p user.info -t JENKINS-CI
00:00:40.202 [Pipeline] sh
00:00:40.489 + logger -p user.info -t JENKINS-CI
00:00:40.504 [Pipeline] sh
00:00:40.792 + cat autorun-spdk.conf
00:00:40.792 SPDK_RUN_FUNCTIONAL_TEST=1
00:00:40.792 SPDK_TEST_FUZZER_SHORT=1
00:00:40.792 SPDK_TEST_FUZZER=1
00:00:40.792 SPDK_RUN_UBSAN=1
00:00:40.800 RUN_NIGHTLY=0
00:00:40.806 [Pipeline] readFile
00:00:40.853 [Pipeline] withEnv
00:00:40.856 [Pipeline] {
00:00:40.869 [Pipeline] sh
00:00:41.155 + set -ex
00:00:41.155 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:00:41.155 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:00:41.155 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:00:41.155 ++ SPDK_TEST_FUZZER_SHORT=1
00:00:41.155 ++ SPDK_TEST_FUZZER=1
00:00:41.155 ++ SPDK_RUN_UBSAN=1
00:00:41.155 ++ RUN_NIGHTLY=0
00:00:41.155 + case $SPDK_TEST_NVMF_NICS in
00:00:41.155 + DRIVERS=
00:00:41.155 + [[ -n '' ]]
00:00:41.155 + exit 0
00:00:41.166 [Pipeline] }
00:00:41.185 [Pipeline] // withEnv
00:00:41.190 [Pipeline] }
00:00:41.242 [Pipeline] // stage
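The Prepare stage above reduces to a small shell pattern: write the SPDK_* toggles into autorun-spdk.conf, source it under set -ex, and exit early when no extra NIC drivers are required. A condensed, illustrative rendering of what the xtrace shows (the real script is generated inline by the Jenkins job; the mlx5 branch below is hypothetical, only the empty default runs here):

    set -ex
    conf=/var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
    [[ -f $conf ]] && source "$conf"   # exports SPDK_RUN_FUNCTIONAL_TEST=1, SPDK_TEST_FUZZER=1, ...
    case ${SPDK_TEST_NVMF_NICS:-} in   # unset in this run
      mlx5_core) DRIVERS=mlx5_ib ;;    # hypothetical mapping, not exercised in this log
      *)         DRIVERS= ;;
    esac
    [[ -n $DRIVERS ]] || exit 0        # nothing to load; hand off to the Tests stage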
00:00:41.263 [Pipeline] catchError
00:00:41.264 [Pipeline] {
00:00:41.274 [Pipeline] timeout
00:00:41.274 Timeout set to expire in 30 min
00:00:41.276 [Pipeline] {
00:00:41.286 [Pipeline] stage
00:00:41.288 [Pipeline] { (Tests)
00:00:41.301 [Pipeline] sh
00:00:41.583 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:41.583 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:41.583 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:00:41.583 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:00:41.583 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:41.583 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:00:41.583 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:00:41.583 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:00:41.583 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:00:41.583 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:00:41.583 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:41.583 + source /etc/os-release
00:00:41.583 ++ NAME='Fedora Linux'
00:00:41.583 ++ VERSION='38 (Cloud Edition)'
00:00:41.583 ++ ID=fedora
00:00:41.583 ++ VERSION_ID=38
00:00:41.583 ++ VERSION_CODENAME=
00:00:41.583 ++ PLATFORM_ID=platform:f38
00:00:41.583 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:00:41.583 ++ ANSI_COLOR='0;38;2;60;110;180'
00:00:41.583 ++ LOGO=fedora-logo-icon
00:00:41.583 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:00:41.583 ++ HOME_URL=https://fedoraproject.org/
00:00:41.583 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:00:41.583 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:00:41.583 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:00:41.583 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:00:41.583 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:00:41.583 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:00:41.583 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:00:41.583 ++ SUPPORT_END=2024-05-14
00:00:41.583 ++ VARIANT='Cloud Edition'
00:00:41.583 ++ VARIANT_ID=cloud
00:00:41.583 + uname -a
00:00:41.583 Linux spdk-wfp-32 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux
00:00:41.583 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:00:44.876 Hugepages
00:00:44.876 node hugesize free / total
00:00:44.876 node0 1048576kB 0 / 0
00:00:44.876 node0 2048kB 0 / 0
00:00:44.876 node1 1048576kB 0 / 0
00:00:44.876 node1 2048kB 0 / 0
00:00:44.876
00:00:44.876 Type BDF Vendor Device NUMA Driver Device Block devices
00:00:44.876 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:00:44.876 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:00:44.876 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:00:44.876 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:00:44.876 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:00:44.876 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:00:44.876 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:00:44.876 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:00:44.876 NVMe 0000:5e:00.0 144d a80a 0 nvme nvme0 nvme0n1
00:00:44.876 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:00:44.876 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:00:44.876 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:00:44.876 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:00:44.876 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:00:44.876 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:00:44.876 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:00:44.876 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:00:44.876 NVMe 0000:af:00.0 8086 2701 1 nvme nvme1 nvme1n1
00:00:44.876 NVMe 0000:b0:00.0 8086 4140 1 nvme nvme2 nvme2n1
00:00:44.876 + rm -f /tmp/spdk-ld-path
00:00:44.876 + source autorun-spdk.conf
00:00:44.876 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:00:44.876 ++ SPDK_TEST_FUZZER_SHORT=1
00:00:44.876 ++ SPDK_TEST_FUZZER=1
00:00:44.876 ++ SPDK_RUN_UBSAN=1
00:00:44.876 ++ RUN_NIGHTLY=0
00:00:44.876 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:00:44.877 + [[ -n '' ]]
00:00:44.877 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:44.877 + for M in /var/spdk/build-*-manifest.txt
00:00:44.877 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:00:44.877 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:00:44.877 + for M in /var/spdk/build-*-manifest.txt
00:00:44.877 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:00:44.877 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:00:44.877 ++ uname
00:00:44.877 + [[ Linux == \L\i\n\u\x ]]
00:00:44.877 + sudo dmesg -T
00:00:44.877 + sudo dmesg --clear
00:00:44.877 + dmesg_pid=83366
00:00:44.877 + [[ Fedora Linux == FreeBSD ]]
00:00:44.877 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:00:44.877 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:00:44.877 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:00:44.877 + sudo dmesg -Tw
00:00:44.877 + [[ -x /usr/src/fio-static/fio ]]
00:00:44.877 + export FIO_BIN=/usr/src/fio-static/fio
00:00:44.877 + FIO_BIN=/usr/src/fio-static/fio
00:00:44.877 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:00:44.877 + [[ ! -v VFIO_QEMU_BIN ]]
00:00:44.877 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:00:44.877 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:00:44.877 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:00:44.877 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:00:44.877 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:00:44.877 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:00:44.877 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:00:44.877 Test configuration:
00:00:44.877 SPDK_RUN_FUNCTIONAL_TEST=1
00:00:44.877 SPDK_TEST_FUZZER_SHORT=1
00:00:44.877 SPDK_TEST_FUZZER=1
00:00:44.877 SPDK_RUN_UBSAN=1
00:00:44.877 RUN_NIGHTLY=0 10:19:06 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:00:44.877 10:19:06 -- scripts/common.sh@502 -- $ [[ -e /bin/wpdk_common.sh ]]
00:00:44.877 10:19:06 -- scripts/common.sh@510 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:00:44.877 10:19:06 -- scripts/common.sh@511 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:00:44.877 10:19:06 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:44.877 10:19:06 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:44.877 10:19:06 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:44.877 10:19:06 -- paths/export.sh@5 -- $ export PATH
00:00:44.877 10:19:06 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:44.877 10:19:06 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:00:44.877 10:19:06 -- common/autobuild_common.sh@435 -- $ date +%s
00:00:44.877 10:19:06 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1713514746.XXXXXX
00:00:44.877 10:19:06 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1713514746.aoATxS
00:00:44.877 10:19:06 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]]
00:00:44.877 10:19:06 -- common/autobuild_common.sh@441 -- $ '[' -n '' ']'
00:00:44.877 10:19:06 -- common/autobuild_common.sh@444 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/'
00:00:44.877 10:19:06 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:00:44.877 10:19:06 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:00:44.877 10:19:06 -- common/autobuild_common.sh@451 -- $ get_config_params
00:00:44.877 10:19:06 -- common/autotest_common.sh@385 -- $ xtrace_disable
00:00:44.877 10:19:06 -- common/autotest_common.sh@10 -- $ set +x
00:00:45.137 10:19:06 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user'
00:00:45.137 10:19:06 -- common/autobuild_common.sh@453 -- $ start_monitor_resources
00:00:45.137 10:19:06 -- pm/common@17 -- $ local monitor
00:00:45.137 10:19:06 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:45.137 10:19:06 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=83402
00:00:45.137 10:19:06 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:45.137 10:19:06 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=83404
00:00:45.137 10:19:06 -- pm/common@21 -- $ date +%s
00:00:45.137 10:19:06 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:45.137 10:19:06 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=83406
00:00:45.137 10:19:06 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:45.137 10:19:06 -- pm/common@21 -- $ date +%s
00:00:45.137 10:19:06 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=83409
00:00:45.137 10:19:06 -- pm/common@26 -- $ sleep 1
00:00:45.137 10:19:06 -- pm/common@21 -- $ date +%s
00:00:45.137 10:19:06 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1713514746
00:00:45.137 10:19:06 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1713514746
00:00:45.137 10:19:07 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1713514746
00:00:45.137 10:19:07 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1713514747
00:00:45.137 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1713514746_collect-vmstat.pm.log
00:00:45.137 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1713514747_collect-bmc-pm.bmc.pm.log
00:00:45.137 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1713514746_collect-cpu-load.pm.log
00:00:45.137 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1713514746_collect-cpu-temp.pm.log
00:00:46.077 10:19:08 -- common/autobuild_common.sh@454 -- $ trap stop_monitor_resources EXIT
00:00:46.077 10:19:08 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:00:46.077 10:19:08 -- spdk/autobuild.sh@12 -- $ umask 022
00:00:46.077 10:19:08 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:46.077 10:19:08 -- spdk/autobuild.sh@16 -- $ date -u
00:00:46.077 Fri Apr 19 08:19:08 AM UTC 2024
00:00:46.077 10:19:08 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:00:46.077 v24.05-pre-412-g3381d6e5b
00:00:46.077 10:19:08 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:00:46.077 10:19:08 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:00:46.077 10:19:08 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:00:46.077 10:19:08 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']'
00:00:46.077 10:19:08 -- common/autotest_common.sh@1093 -- $ xtrace_disable
00:00:46.077 10:19:08 -- common/autotest_common.sh@10 -- $ set +x
00:00:46.336 ************************************
00:00:46.336 START TEST ubsan
00:00:46.336 ************************************
00:00:46.336 10:19:08 -- common/autotest_common.sh@1111 -- $ echo 'using ubsan'
00:00:46.336 using ubsan
00:00:46.336
00:00:46.336 real 0m0.001s
00:00:46.336 user 0m0.000s
00:00:46.336 sys 0m0.000s
00:00:46.336 10:19:08 -- common/autotest_common.sh@1112 -- $ xtrace_disable
00:00:46.336 10:19:08 -- common/autotest_common.sh@10 -- $ set +x
00:00:46.336 ************************************
00:00:46.336 END TEST ubsan
00:00:46.336 ************************************
00:00:46.336 10:19:08 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:00:46.336 10:19:08 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:00:46.336 10:19:08 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:00:46.336 10:19:08 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]]
00:00:46.336 10:19:08 -- spdk/autobuild.sh@52 -- $ llvm_precompile
00:00:46.336 10:19:08 -- common/autobuild_common.sh@423 -- $ run_test autobuild_llvm_precompile _llvm_precompile
00:00:46.336 10:19:08 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']'
00:00:46.336 10:19:08 -- common/autotest_common.sh@1093 -- $ xtrace_disable
00:00:46.336 10:19:08 -- common/autotest_common.sh@10 -- $ set +x
00:00:46.336 ************************************
00:00:46.336 START TEST autobuild_llvm_precompile
00:00:46.336 ************************************
00:00:46.336 10:19:08 -- common/autotest_common.sh@1111 -- $ _llvm_precompile
00:00:46.336 10:19:08 -- common/autobuild_common.sh@32 -- $ clang --version
00:00:48.241 10:19:10 -- common/autobuild_common.sh@32 -- $ [[ clang version 16.0.6 (Fedora 16.0.6-3.fc38)
00:00:48.241 Target: x86_64-redhat-linux-gnu
00:00:48.241 Thread model: posix
00:00:48.241 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]]
00:00:48.241 10:19:10 -- common/autobuild_common.sh@33 -- $ clang_num=16
00:00:48.241 10:19:10 -- common/autobuild_common.sh@35 -- $ export CC=clang-16
00:00:48.241 10:19:10 -- common/autobuild_common.sh@35 -- $ CC=clang-16
00:00:48.241 10:19:10 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-16
00:00:48.241 10:19:10 -- common/autobuild_common.sh@36 -- $ CXX=clang++-16
00:00:48.241 10:19:10 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
00:00:48.241 10:19:10 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:00:48.241 10:19:10 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a ]]
00:00:48.241 10:19:10 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a'
00:00:48.241 10:19:10 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:00:48.809 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:00:48.809 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:00:49.748 Using 'verbs' RDMA provider
00:01:06.016 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done.
00:01:20.912 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:01:20.912 Creating mk/config.mk...done.
00:01:20.912 Creating mk/cc.flags.mk...done.
00:01:20.912 Type 'make' to build.
00:01:20.912
00:01:20.912 real 0m33.535s
00:01:20.912 user 0m12.932s
00:01:20.912 sys 0m20.195s
00:01:20.912 10:19:41 -- common/autotest_common.sh@1112 -- $ xtrace_disable
00:01:20.912 10:19:41 -- common/autotest_common.sh@10 -- $ set +x
00:01:20.912 ************************************
00:01:20.912 END TEST autobuild_llvm_precompile
00:01:20.912 ************************************
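Two details in the precompile trace above are easy to miss: clang's major version is extracted with a bash regex against the full clang --version banner, and the fuzzer runtime is located with extglob patterns (@(...) and ?(...)), which only match with shopt -s extglob enabled. A condensed sketch of that detection logic, assuming the same Fedora clang-16 layout as this host (the standalone-script framing is illustrative, not SPDK's exact helper):

    shopt -s extglob nullglob          # @(...) and ?(...) below are extglob patterns
    ver=$(clang --version)
    [[ $ver =~ version\ (([0-9]+)\.([0-9]+)\.([0-9]+)) ]] && clang_num=${BASH_REMATCH[2]}
    export CC=clang-$clang_num CXX=clang++-$clang_num
    fuzzer_libs=(/usr/lib*/clang/@("$clang_num")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
    fuzzer_lib=${fuzzer_libs[0]}       # resolves here to /usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
    [[ -e $fuzzer_lib ]] && config_params+=" --with-fuzzer=$fuzzer_lib"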
00:01:20.912 10:19:41 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:01:20.912 10:19:41 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:01:20.912 10:19:41 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:01:20.912 10:19:41 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]]
00:01:20.912 10:19:41 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:01:20.912 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:01:20.912 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:01:20.912 Using 'verbs' RDMA provider
00:01:34.067 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done.
00:01:46.288 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:01:46.288 Creating mk/config.mk...done.
00:01:46.288 Creating mk/cc.flags.mk...done.
00:01:46.288 Type 'make' to build.
00:01:46.288 10:20:07 -- spdk/autobuild.sh@69 -- $ run_test make make -j72
00:01:46.288 10:20:07 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']'
00:01:46.288 10:20:07 -- common/autotest_common.sh@1093 -- $ xtrace_disable
00:01:46.288 10:20:07 -- common/autotest_common.sh@10 -- $ set +x
00:01:46.288 ************************************
00:01:46.288 START TEST make
00:01:46.288 ************************************
00:01:46.288 10:20:07 -- common/autotest_common.sh@1111 -- $ make -j72
00:01:46.288 make[1]: Nothing to be done for 'all'.
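Every autobuild step in this log goes through the same run_test harness: a banner, the timed command, and a closing banner. An illustrative stand-in for that wrapper (SPDK's real implementation lives in common/autotest_common.sh, which is not part of this log):

    run_test() {                        # minimal sketch, not the upstream function
      local name=$1; shift
      echo "************************************"
      echo "START TEST $name"
      echo "************************************"
      time "$@"; local rc=$?            # produces the real/user/sys lines seen above
      echo "************************************"
      echo "END TEST $name"
      echo "************************************"
      return $rc
    }
    run_test make make -j72             # -j72 matches this host; make -j"$(nproc)" generalizes it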
00:01:48.832 The Meson build system
00:01:48.832 Version: 1.3.1
00:01:48.832 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
00:01:48.832 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:01:48.832 Build type: native build
00:01:48.832 Project name: libvfio-user
00:01:48.832 Project version: 0.0.1
00:01:48.832 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)")
00:01:48.832 C linker for the host machine: clang-16 ld.bfd 2.39-16
00:01:48.832 Host machine cpu family: x86_64
00:01:48.832 Host machine cpu: x86_64
00:01:48.832 Run-time dependency threads found: YES
00:01:48.832 Library dl found: YES
00:01:48.832 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:01:48.832 Run-time dependency json-c found: YES 0.17
00:01:48.832 Run-time dependency cmocka found: YES 1.1.7
00:01:48.832 Program pytest-3 found: NO
00:01:48.832 Program flake8 found: NO
00:01:48.832 Program misspell-fixer found: NO
00:01:48.832 Program restructuredtext-lint found: NO
00:01:48.832 Program valgrind found: YES (/usr/bin/valgrind)
00:01:48.832 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:01:48.832 Compiler for C supports arguments -Wmissing-declarations: YES
00:01:48.832 Compiler for C supports arguments -Wwrite-strings: YES
00:01:48.832 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:01:48.832 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh)
00:01:48.832 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh)
00:01:48.832 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:01:48.832 Build targets in project: 8
00:01:48.832 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions:
00:01:48.832 * 0.57.0: {'exclude_suites arg in add_test_setup'}
00:01:48.832
00:01:48.832 libvfio-user 0.0.1
00:01:48.832
00:01:48.832 User defined options
00:01:48.832 buildtype : debug
00:01:48.832 default_library: static
00:01:48.832 libdir : /usr/local/lib
00:01:48.832
00:01:48.832 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:01:48.832 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:01:48.832 [1/36] Compiling C object samples/null.p/null.c.o
00:01:48.832 [2/36] Compiling C object samples/lspci.p/lspci.c.o
00:01:48.832 [3/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o
00:01:48.832 [4/36] Compiling C object samples/client.p/.._lib_tran.c.o
00:01:48.832 [5/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o
00:01:48.832 [6/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o
00:01:48.832 [7/36] Compiling C object lib/libvfio-user.a.p/irq.c.o
00:01:48.832 [8/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o
00:01:48.832 [9/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o
00:01:48.832 [10/36] Compiling C object test/unit_tests.p/mocks.c.o
00:01:48.832 [11/36] Compiling C object lib/libvfio-user.a.p/pci.c.o
00:01:48.832 [12/36] Compiling C object lib/libvfio-user.a.p/migration.c.o
00:01:48.832 [13/36] Compiling C object lib/libvfio-user.a.p/tran.c.o
00:01:48.832 [14/36] Compiling C object samples/server.p/server.c.o
00:01:48.832 [15/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o
00:01:48.832 [16/36] Compiling C object lib/libvfio-user.a.p/dma.c.o
00:01:48.832 [17/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o
00:01:48.832 [18/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o
00:01:48.832 [19/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o
00:01:48.832 [20/36] Compiling C object samples/client.p/.._lib_migration.c.o
00:01:48.832 [21/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o
00:01:48.832 [22/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o
00:01:48.832 [23/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o
00:01:48.832 [24/36] Compiling C object test/unit_tests.p/unit-tests.c.o
00:01:48.832 [25/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o
00:01:48.832 [26/36] Compiling C object samples/client.p/client.c.o
00:01:48.832 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o
00:01:48.832 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o
00:01:49.093 [29/36] Linking target samples/client
00:01:49.093 [30/36] Linking static target lib/libvfio-user.a
00:01:49.093 [31/36] Linking target test/unit_tests
00:01:49.093 [32/36] Linking target samples/lspci
00:01:49.093 [33/36] Linking target samples/server
00:01:49.093 [34/36] Linking target samples/shadow_ioeventfd_server
00:01:49.093 [35/36] Linking target samples/null
00:01:49.093 [36/36] Linking target samples/gpio-pci-idio-16
00:01:49.093 INFO: autodetecting backend as ninja
00:01:49.093 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:01:49.093 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:01:49.354 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:01:49.354 ninja: no work to do.
00:01:54.637 The Meson build system
00:01:54.637 Version: 1.3.1
00:01:54.637 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk
00:01:54.637 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp
00:01:54.637 Build type: native build
00:01:54.637 Program cat found: YES (/usr/bin/cat)
00:01:54.637 Project name: DPDK
00:01:54.637 Project version: 23.11.0
00:01:54.637 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)")
00:01:54.637 C linker for the host machine: clang-16 ld.bfd 2.39-16
00:01:54.637 Host machine cpu family: x86_64
00:01:54.637 Host machine cpu: x86_64
00:01:54.637 Message: ## Building in Developer Mode ##
00:01:54.637 Program pkg-config found: YES (/usr/bin/pkg-config)
00:01:54.637 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:01:54.637 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:01:54.637 Program python3 found: YES (/usr/bin/python3)
00:01:54.637 Program cat found: YES (/usr/bin/cat)
00:01:54.637 Compiler for C supports arguments -march=native: YES
00:01:54.637 Checking for size of "void *" : 8
00:01:54.637 Checking for size of "void *" : 8 (cached)
00:01:54.637 Library m found: YES
00:01:54.637 Library numa found: YES
00:01:54.637 Has header "numaif.h" : YES
00:01:54.637 Library fdt found: NO
00:01:54.637 Library execinfo found: NO
00:01:54.637 Has header "execinfo.h" : YES
00:01:54.637 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:01:54.637 Run-time dependency libarchive found: NO (tried pkgconfig)
00:01:54.637 Run-time dependency libbsd found: NO (tried pkgconfig)
00:01:54.637 Run-time dependency jansson found: NO (tried pkgconfig)
00:01:54.637 Run-time dependency openssl found: YES 3.0.9
00:01:54.637 Run-time dependency libpcap found: YES 1.10.4
00:01:54.637 Has header "pcap.h" with dependency libpcap: YES
00:01:54.637 Compiler for C supports arguments -Wcast-qual: YES
00:01:54.637 Compiler for C supports arguments -Wdeprecated: YES
00:01:54.637 Compiler for C supports arguments -Wformat: YES
00:01:54.637 Compiler for C supports arguments -Wformat-nonliteral: YES
00:01:54.637 Compiler for C supports arguments -Wformat-security: YES
00:01:54.637 Compiler for C supports arguments -Wmissing-declarations: YES
00:01:54.637 Compiler for C supports arguments -Wmissing-prototypes: YES
00:01:54.637 Compiler for C supports arguments -Wnested-externs: YES
00:01:54.637 Compiler for C supports arguments -Wold-style-definition: YES
00:01:54.637 Compiler for C supports arguments -Wpointer-arith: YES
00:01:54.637 Compiler for C supports arguments -Wsign-compare: YES
00:01:54.637 Compiler for C supports arguments -Wstrict-prototypes: YES
00:01:54.637 Compiler for C supports arguments -Wundef: YES
00:01:54.637 Compiler for C supports arguments -Wwrite-strings: YES
00:01:54.637 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:01:54.637 Compiler for C supports arguments -Wno-packed-not-aligned: NO
00:01:54.637 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:01:54.637 Program objdump found: YES (/usr/bin/objdump)
00:01:54.637 Compiler for C supports arguments -mavx512f: YES
00:01:54.637 Checking if "AVX512 checking" compiles: YES
00:01:54.637 Fetching value of define "__SSE4_2__" : 1
00:01:54.637 Fetching value of define "__AES__" : 1
00:01:54.637 Fetching value of define "__AVX__" : 1
00:01:54.637 Fetching value of define "__AVX2__" : 1
00:01:54.637 Fetching value of define "__AVX512BW__" : 1
00:01:54.637 Fetching value of define "__AVX512CD__" : 1
00:01:54.637 Fetching value of define "__AVX512DQ__" : 1
00:01:54.637 Fetching value of define "__AVX512F__" : 1
00:01:54.637 Fetching value of define "__AVX512VL__" : 1
00:01:54.637 Fetching value of define "__PCLMUL__" : 1
00:01:54.637 Fetching value of define "__RDRND__" : 1
00:01:54.638 Fetching value of define "__RDSEED__" : 1
00:01:54.638 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:01:54.638 Fetching value of define "__znver1__" : (undefined)
00:01:54.638 Fetching value of define "__znver2__" : (undefined)
00:01:54.638 Fetching value of define "__znver3__" : (undefined)
00:01:54.638 Fetching value of define "__znver4__" : (undefined)
00:01:54.638 Compiler for C supports arguments -Wno-format-truncation: NO
00:01:54.638 Message: lib/log: Defining dependency "log"
00:01:54.638 Message: lib/kvargs: Defining dependency "kvargs"
00:01:54.638 Message: lib/telemetry: Defining dependency "telemetry"
00:01:54.638 Checking for function "getentropy" : NO
00:01:54.638 Message: lib/eal: Defining dependency "eal"
00:01:54.638 Message: lib/ring: Defining dependency "ring"
00:01:54.638 Message: lib/rcu: Defining dependency "rcu"
00:01:54.638 Message: lib/mempool: Defining dependency "mempool"
00:01:54.638 Message: lib/mbuf: Defining dependency "mbuf"
00:01:54.638 Fetching value of define "__PCLMUL__" : 1 (cached)
00:01:54.638 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:54.638 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:54.638 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:54.638 Fetching value of define "__AVX512VL__" : 1 (cached)
00:01:54.638 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:01:54.638 Compiler for C supports arguments -mpclmul: YES
00:01:54.638 Compiler for C supports arguments -maes: YES
00:01:54.638 Compiler for C supports arguments -mavx512f: YES (cached)
00:01:54.638 Compiler for C supports arguments -mavx512bw: YES
00:01:54.638 Compiler for C supports arguments -mavx512dq: YES
00:01:54.638 Compiler for C supports arguments -mavx512vl: YES
00:01:54.638 Compiler for C supports arguments -mvpclmulqdq: YES
00:01:54.638 Compiler for C supports arguments -mavx2: YES
00:01:54.638 Compiler for C supports arguments -mavx: YES
00:01:54.638 Message: lib/net: Defining dependency "net"
00:01:54.638 Message: lib/meter: Defining dependency "meter"
00:01:54.638 Message: lib/ethdev: Defining dependency "ethdev"
00:01:54.638 Message: lib/pci: Defining dependency "pci"
00:01:54.638 Message: lib/cmdline: Defining dependency "cmdline"
00:01:54.638 Message: lib/hash: Defining dependency "hash"
00:01:54.638 Message: lib/timer: Defining dependency "timer"
00:01:54.638 Message: lib/compressdev: Defining dependency "compressdev"
00:01:54.638 Message: lib/cryptodev: Defining dependency "cryptodev"
00:01:54.638 Message: lib/dmadev: Defining dependency "dmadev"
00:01:54.638 Compiler for C supports arguments -Wno-cast-qual: YES
00:01:54.638 Message: lib/power: Defining dependency "power"
00:01:54.638 Message: lib/reorder: Defining dependency "reorder"
00:01:54.638 Message: lib/security: Defining dependency "security"
00:01:54.638 Has header "linux/userfaultfd.h" : YES
00:01:54.638 Has header "linux/vduse.h" : YES
00:01:54.638 Message: lib/vhost: Defining dependency "vhost"
00:01:54.638 Compiler for C supports arguments -Wno-format-truncation: NO (cached)
00:01:54.638 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:01:54.638 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:01:54.638 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:01:54.638 Message: Disabling raw/* drivers: missing internal dependency "rawdev"
00:01:54.638 Message: Disabling regex/* drivers: missing internal dependency "regexdev"
00:01:54.638 Message: Disabling ml/* drivers: missing internal dependency "mldev"
00:01:54.638 Message: Disabling event/* drivers: missing internal dependency "eventdev"
00:01:54.638 Message: Disabling baseband/* drivers: missing internal dependency "bbdev"
00:01:54.638 Message: Disabling gpu/* drivers: missing internal dependency "gpudev"
00:01:54.638 Program doxygen found: YES (/usr/bin/doxygen)
00:01:54.638 Configuring doxy-api-html.conf using configuration
00:01:54.638 Configuring doxy-api-man.conf using configuration
00:01:54.638 Program mandb found: YES (/usr/bin/mandb)
00:01:54.638 Program sphinx-build found: NO
00:01:54.638 Configuring rte_build_config.h using configuration
00:01:54.638 Message:
00:01:54.638 =================
00:01:54.638 Applications Enabled
00:01:54.638 =================
00:01:54.638
00:01:54.638 apps:
00:01:54.638
00:01:54.638
00:01:54.638 Message:
00:01:54.638 =================
00:01:54.638 Libraries Enabled
00:01:54.638 =================
00:01:54.638
00:01:54.638 libs:
00:01:54.638 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:01:54.638 net, meter, ethdev, pci, cmdline, hash, timer, compressdev,
00:01:54.638 cryptodev, dmadev, power, reorder, security, vhost,
00:01:54.638
00:01:54.638 Message:
00:01:54.638 ===============
00:01:54.638 Drivers Enabled
00:01:54.638 ===============
00:01:54.638
00:01:54.638 common:
00:01:54.638
00:01:54.638 bus:
00:01:54.638 pci, vdev,
00:01:54.638 mempool:
00:01:54.638 ring,
00:01:54.638 dma:
00:01:54.638
00:01:54.638 net:
00:01:54.638
00:01:54.638 crypto:
00:01:54.638
00:01:54.638 compress:
00:01:54.638
00:01:54.638 vdpa:
00:01:54.638
00:01:54.638
00:01:54.638 Message:
00:01:54.638 =================
00:01:54.638 Content Skipped
00:01:54.638 =================
00:01:54.638
00:01:54.638 apps:
00:01:54.638 dumpcap: explicitly disabled via build config
00:01:54.638 graph: explicitly disabled via build config
00:01:54.638 pdump: explicitly disabled via build config
00:01:54.638 proc-info: explicitly disabled via build config
00:01:54.638 test-acl: explicitly disabled via build config
00:01:54.638 test-bbdev: explicitly disabled via build config
00:01:54.638 test-cmdline: explicitly disabled via build config
00:01:54.638 test-compress-perf: explicitly disabled via build config
00:01:54.638 test-crypto-perf: explicitly disabled via build config
00:01:54.638 test-dma-perf: explicitly disabled via build config
00:01:54.638 test-eventdev: explicitly disabled via build config
00:01:54.638 test-fib: explicitly disabled via build config
00:01:54.638 test-flow-perf: explicitly disabled via build config
00:01:54.638 test-gpudev: explicitly disabled via build config
00:01:54.638 test-mldev: explicitly disabled via build config
00:01:54.638 test-pipeline: explicitly disabled via build config
00:01:54.638 test-pmd: explicitly disabled via build config
00:01:54.638 test-regex: explicitly disabled via build config
00:01:54.638 test-sad: explicitly disabled via build config
00:01:54.638 test-security-perf: explicitly disabled via build config
00:01:54.638
00:01:54.638 libs:
00:01:54.638 metrics: explicitly disabled via build config
00:01:54.638 acl: explicitly disabled via build config
00:01:54.638 bbdev: explicitly disabled via build config
00:01:54.638 bitratestats: explicitly disabled via build config
00:01:54.638 bpf: explicitly disabled via build config
00:01:54.638 cfgfile: explicitly disabled via build config
00:01:54.638 distributor: explicitly disabled via build config
00:01:54.638 efd: explicitly disabled via build config
00:01:54.638 eventdev: explicitly disabled via build config
00:01:54.638 dispatcher: explicitly disabled via build config
00:01:54.638 gpudev: explicitly disabled via build config
00:01:54.638 gro: explicitly disabled via build config
00:01:54.638 gso: explicitly disabled via build config
00:01:54.638 ip_frag: explicitly disabled via build config
00:01:54.638 jobstats: explicitly disabled via build config
00:01:54.638 latencystats: explicitly disabled via build config
00:01:54.638 lpm: explicitly disabled via build config
00:01:54.638 member: explicitly disabled via build config
00:01:54.638 pcapng: explicitly disabled via build config
00:01:54.638 rawdev: explicitly disabled via build config
00:01:54.638 regexdev: explicitly disabled via build config
00:01:54.638 mldev: explicitly disabled via build config
00:01:54.638 rib: explicitly disabled via build config
00:01:54.638 sched: explicitly disabled via build config
00:01:54.638 stack: explicitly disabled via build config
00:01:54.638 ipsec: explicitly disabled via build config
00:01:54.638 pdcp: explicitly disabled via build config
00:01:54.638 fib: explicitly disabled via build config
00:01:54.638 port: explicitly disabled via build config
00:01:54.638 pdump: explicitly disabled via build config
00:01:54.638 table: explicitly disabled via build config
00:01:54.638 pipeline: explicitly disabled via build config
00:01:54.638 graph: explicitly disabled via build config
00:01:54.638 node: explicitly disabled via build config
00:01:54.638
00:01:54.638 drivers:
00:01:54.638 common/cpt: not in enabled drivers build config
00:01:54.638 common/dpaax: not in enabled drivers build config
00:01:54.638 common/iavf: not in enabled drivers build config
00:01:54.638 common/idpf: not in enabled drivers build config
00:01:54.638 common/mvep: not in enabled drivers build config
00:01:54.638 common/octeontx: not in enabled drivers build config
00:01:54.638 bus/auxiliary: not in enabled drivers build config
00:01:54.638 bus/cdx: not in enabled drivers build config
00:01:54.638 bus/dpaa: not in enabled drivers build config
00:01:54.638 bus/fslmc: not in enabled drivers build config
00:01:54.638 bus/ifpga: not in enabled drivers build config
00:01:54.638 bus/platform: not in enabled drivers build config
00:01:54.638 bus/vmbus: not in enabled drivers build config
00:01:54.638 common/cnxk: not in enabled drivers build config
00:01:54.638 common/mlx5: not in enabled drivers build config
00:01:54.638 common/nfp: not in enabled drivers build config
00:01:54.638 common/qat: not in enabled drivers build config
00:01:54.638 common/sfc_efx: not in enabled drivers build config
00:01:54.638 mempool/bucket: not in enabled drivers build config
00:01:54.638 mempool/cnxk: not in enabled drivers build config
00:01:54.638 mempool/dpaa: not in enabled drivers build config
00:01:54.638 mempool/dpaa2: not in enabled drivers build config
00:01:54.639 mempool/octeontx: not in enabled drivers build config
00:01:54.639 mempool/stack: not in enabled drivers build config
00:01:54.639 dma/cnxk: not in enabled drivers build config
00:01:54.639 dma/dpaa: not in enabled drivers build config
00:01:54.639 dma/dpaa2: not in enabled drivers build config
00:01:54.639 dma/hisilicon: not in enabled drivers build config
00:01:54.639 dma/idxd: not in enabled drivers build config
00:01:54.639 dma/ioat: not in enabled drivers build config
00:01:54.639 dma/skeleton: not in enabled drivers build config
00:01:54.639 net/af_packet: not in enabled drivers build config
00:01:54.639 net/af_xdp: not in enabled drivers build config
00:01:54.639 net/ark: not in enabled drivers build config
00:01:54.639 net/atlantic: not in enabled drivers build config
00:01:54.639 net/avp: not in enabled drivers build config
00:01:54.639 net/axgbe: not in enabled drivers build config
00:01:54.639 net/bnx2x: not in enabled drivers build config
00:01:54.639 net/bnxt: not in enabled drivers build config
00:01:54.639 net/bonding: not in enabled drivers build config
00:01:54.639 net/cnxk: not in enabled drivers build config
00:01:54.639 net/cpfl: not in enabled drivers build config
00:01:54.639 net/cxgbe: not in enabled drivers build config
00:01:54.639 net/dpaa: not in enabled drivers build config
00:01:54.639 net/dpaa2: not in enabled drivers build config
00:01:54.639 net/e1000: not in enabled drivers build config
00:01:54.639 net/ena: not in enabled drivers build config
00:01:54.639 net/enetc: not in enabled drivers build config
00:01:54.639 net/enetfec: not in enabled drivers build config
00:01:54.639 net/enic: not in enabled drivers build config
00:01:54.639 net/failsafe: not in enabled drivers build config
00:01:54.639 net/fm10k: not in enabled drivers build config
00:01:54.639 net/gve: not in enabled drivers build config
00:01:54.639 net/hinic: not in enabled drivers build config
00:01:54.639 net/hns3: not in enabled drivers build config
00:01:54.639 net/i40e: not in enabled drivers build config
00:01:54.639 net/iavf: not in enabled drivers build config
00:01:54.639 net/ice: not in enabled drivers build config
00:01:54.639 net/idpf: not in enabled drivers build config
00:01:54.639 net/igc: not in enabled drivers build config
00:01:54.639 net/ionic: not in enabled drivers build config
00:01:54.639 net/ipn3ke: not in enabled drivers build config
00:01:54.639 net/ixgbe: not in enabled drivers build config
00:01:54.639 net/mana: not in enabled drivers build config
00:01:54.639 net/memif: not in enabled drivers build config
00:01:54.639 net/mlx4: not in enabled drivers build config
00:01:54.639 net/mlx5: not in enabled drivers build config
00:01:54.639 net/mvneta: not in enabled drivers build config
00:01:54.639 net/mvpp2: not in enabled drivers build config
00:01:54.639 net/netvsc: not in enabled drivers build config
00:01:54.639 net/nfb: not in enabled drivers build config
00:01:54.639 net/nfp: not in enabled drivers build config
00:01:54.639 net/ngbe: not in enabled drivers build config
00:01:54.639 net/null: not in enabled drivers build config
00:01:54.639 net/octeontx: not in enabled drivers build config
00:01:54.639 net/octeon_ep: not in enabled drivers build config
00:01:54.639 net/pcap: not in enabled drivers build config
00:01:54.639 net/pfe: not in enabled drivers build config
00:01:54.639 net/qede: not in enabled drivers build config
00:01:54.639 net/ring: not in enabled drivers build config
00:01:54.639 net/sfc: not in enabled drivers build config
00:01:54.639 net/softnic: not in enabled drivers build config
00:01:54.639 net/tap: not in enabled drivers build config
00:01:54.639 net/thunderx: not in enabled drivers build config
00:01:54.639 net/txgbe: not in enabled drivers build config
00:01:54.639 net/vdev_netvsc: not in enabled drivers build config
00:01:54.639 net/vhost: not in enabled drivers build config
00:01:54.639 net/virtio: not in enabled drivers build config
00:01:54.639 net/vmxnet3: not in enabled drivers build config
00:01:54.639 raw/*: missing internal dependency, "rawdev"
00:01:54.639 crypto/armv8: not in enabled drivers build config
00:01:54.639 crypto/bcmfs: not in enabled drivers build config
00:01:54.639 crypto/caam_jr: not in enabled drivers build config
00:01:54.639 crypto/ccp: not in enabled drivers build config
00:01:54.639 crypto/cnxk: not in enabled drivers build config
00:01:54.639 crypto/dpaa_sec: not in enabled drivers build config
00:01:54.639 crypto/dpaa2_sec: not in enabled drivers build config
00:01:54.639 crypto/ipsec_mb: not in enabled drivers build config
00:01:54.639 crypto/mlx5: not in enabled drivers build config
00:01:54.639 crypto/mvsam: not in enabled drivers build config
00:01:54.639 crypto/nitrox: not in enabled drivers build config
00:01:54.639 crypto/null: not in enabled drivers build config
00:01:54.639 crypto/octeontx: not in enabled drivers build config
00:01:54.639 crypto/openssl: not in enabled drivers build config
00:01:54.639 crypto/scheduler: not in enabled drivers build config
00:01:54.639 crypto/uadk: not in enabled drivers build config
00:01:54.639 crypto/virtio: not in enabled drivers build config
00:01:54.639 compress/isal: not in enabled drivers build config
00:01:54.639 compress/mlx5: not in enabled drivers build config
00:01:54.639 compress/octeontx: not in enabled drivers build config
00:01:54.639 compress/zlib: not in enabled drivers build config
00:01:54.639 regex/*: missing internal dependency, "regexdev"
00:01:54.639 ml/*: missing internal dependency, "mldev"
00:01:54.639 vdpa/ifc: not in enabled drivers build config
00:01:54.639 vdpa/mlx5: not in enabled drivers build config
00:01:54.639 vdpa/nfp: not in enabled drivers build config
00:01:54.639 vdpa/sfc: not in enabled drivers build config
00:01:54.639 event/*: missing internal dependency, "eventdev"
00:01:54.639 baseband/*: missing internal dependency, "bbdev"
00:01:54.639 gpu/*: missing internal dependency, "gpudev"
00:01:54.639
00:01:54.639
00:01:54.899 Build targets in project: 85
00:01:54.899
00:01:54.899 DPDK 23.11.0
00:01:54.899
00:01:54.899 User defined options
00:01:54.899 buildtype : debug
00:01:54.899 default_library : static
00:01:54.899 libdir : lib
00:01:54.899 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:01:54.899 c_args : -fPIC -Werror
00:01:54.899 c_link_args :
00:01:54.899 cpu_instruction_set: native
00:01:54.899 disable_apps : test-sad,graph,test-regex,dumpcap,test-eventdev,test-compress-perf,pdump,test-security-perf,test-pmd,test-flow-perf,test-pipeline,test-crypto-perf,test-gpudev,test-cmdline,test-dma-perf,proc-info,test-bbdev,test-acl,test,test-mldev,test-fib
00:01:54.899 disable_libs : sched,port,dispatcher,graph,rawdev,pdcp,bitratestats,ipsec,pcapng,pdump,gso,cfgfile,gpudev,ip_frag,node,distributor,mldev,lpm,acl,bpf,latencystats,eventdev,regexdev,gro,stack,fib,pipeline,bbdev,table,metrics,member,jobstats,efd,rib
00:01:54.899 enable_docs : false
00:01:54.899 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring
00:01:54.899 enable_kmods : false
00:01:54.899 tests : false
00:01:54.899
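The "User defined options" summary above pins down the DPDK configuration completely, so the meson invocation SPDK's dpdkbuild wrapper issued can be reconstructed along these lines (illustrative; the wrapper itself is not shown in this log, and the two long disable lists are abbreviated to the values printed above):

    meson setup build-tmp \
      -Dbuildtype=debug -Ddefault_library=static -Dlibdir=lib \
      -Dprefix=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build \
      -Dc_args='-fPIC -Werror' -Dcpu_instruction_set=native \
      -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring \
      -Denable_docs=false -Denable_kmods=false -Dtests=false \
      -Ddisable_apps=test-sad,graph,test-regex,... \
      -Ddisable_libs=sched,port,dispatcher,...   # full lists exactly as in the summary above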
00:01:54.899 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:01:55.168 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp'
00:01:55.168 [1/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o
00:01:55.168 [2/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o
00:01:55.168 [3/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o
00:01:55.168 [4/265] Compiling C object lib/librte_log.a.p/log_log_linux.c.o
00:01:55.168 [5/265] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o
00:01:55.168 [6/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o
00:01:55.168 [7/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o
00:01:55.168 [8/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o
00:01:55.168 [9/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o
00:01:55.168 [10/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o
00:01:55.168 [11/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o
00:01:55.168 [12/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o
00:01:55.168 [13/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o
00:01:55.168 [14/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o
00:01:55.168 [15/265] Linking static target lib/librte_kvargs.a
00:01:55.168 [16/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o
00:01:55.168 [17/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o
00:01:55.168 [18/265] Compiling C object lib/librte_log.a.p/log_log.c.o
00:01:55.168 [19/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o
00:01:55.168 [20/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o
00:01:55.168 [21/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o
00:01:55.168 [22/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o
00:01:55.168 [23/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o
00:01:55.168 [24/265] Linking static target lib/librte_log.a
00:01:55.168 [25/265] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o
00:01:55.695 [26/265] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output)
00:01:55.695 [27/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o
00:01:55.695 [28/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o
00:01:55.695 [29/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o
00:01:55.695 [30/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o
00:01:55.695 [31/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o
00:01:55.695 [32/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o
00:01:55.695 [33/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o
00:01:55.695 [34/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o
00:01:55.695 [35/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o
00:01:55.695 [36/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o
00:01:55.695 [37/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o
00:01:55.695 [38/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o
00:01:55.695 [39/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o
00:01:55.695 [40/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o
00:01:55.695 [41/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o
00:01:55.695 [42/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o
00:01:55.695 [43/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o
00:01:55.695 [44/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o
00:01:55.695 [45/265] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o
00:01:55.695 [46/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o
00:01:55.695 [47/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o
00:01:55.695 [48/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o
00:01:55.695 [49/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o
00:01:55.958 [50/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o
00:01:55.958 [51/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o
00:01:55.958 [52/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o
00:01:55.958 [53/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o
00:01:55.958 [54/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o
00:01:55.958 [55/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o
00:01:55.958 [56/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o
00:01:55.958 [57/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o
00:01:55.958 [58/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o
00:01:55.958 [59/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o
00:01:55.958 [60/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o
00:01:55.958 [61/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o
00:01:55.958 [62/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o
00:01:55.958 [63/265] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o
00:01:55.958 [64/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o
00:01:55.958 [65/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o
00:01:55.958 [66/265] Linking static target lib/librte_telemetry.a
00:01:55.958 [67/265] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o
00:01:55.958 [68/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o
00:01:55.959 [69/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o
00:01:55.959 [70/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o
00:01:55.959 [71/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o
00:01:55.959 [72/265] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o
00:01:55.959 [73/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o
00:01:55.959 [74/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o
00:01:55.959 [75/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o
00:01:55.959 [76/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o
00:01:55.959 [77/265] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o
00:01:55.959 [78/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o
00:01:55.959 [79/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o
00:01:55.959 [80/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o
00:01:55.959 [81/265] Linking static target lib/librte_ring.a
00:01:55.959 [82/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o
00:01:55.959 [83/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o
00:01:55.959 [84/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o
00:01:55.959 [85/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o
00:01:55.959 [86/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o
00:01:55.959 [87/265] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o
00:01:55.959 [88/265] Linking static target lib/librte_pci.a
00:01:55.959 [89/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o
00:01:55.959 [90/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o
00:01:55.959 [91/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o
00:01:55.959 [92/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o
00:01:55.959 [93/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o
00:01:55.959 [94/265] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o
00:01:55.959 [95/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o
00:01:55.959 [96/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o
00:01:55.959 [97/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o
00:01:55.959 [98/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o
00:01:55.959 [99/265] Linking static target lib/net/libnet_crc_avx512_lib.a
00:01:55.959 [100/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o
00:01:55.959 [101/265] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o
00:01:55.959 [102/265] Compiling C object lib/librte_net.a.p/net_rte_net.c.o
00:01:55.959 [103/265] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o
00:01:55.959 [104/265] Compiling C object lib/librte_power.a.p/power_power_common.c.o
00:01:55.959 [105/265] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o
00:01:55.959 [106/265] Linking static target lib/librte_meter.a
00:01:55.959 [107/265] Linking static target lib/librte_eal.a
00:01:55.959 [108/265] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o
00:01:55.959 [109/265] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o
00:01:55.959 [110/265] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output)
00:01:55.959 [111/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o
00:01:55.959 [112/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o
00:01:55.959 [113/265] Linking static target lib/librte_rcu.a
00:01:55.959 [114/265] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o
00:01:55.959 [115/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o
00:01:55.959 [116/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o
00:01:55.959 [117/265] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o
00:01:55.959 [118/265] Linking static target lib/librte_mempool.a
00:01:55.959 [119/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o
00:01:55.959 [120/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o
00:01:55.959
[121/265] Linking static target lib/librte_net.a 00:01:56.218 [122/265] Linking target lib/librte_log.so.24.0 00:01:56.218 [123/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:56.218 [124/265] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.218 [125/265] Linking static target lib/librte_mbuf.a 00:01:56.218 [126/265] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.218 [127/265] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.218 [128/265] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:01:56.218 [129/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:56.218 [130/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:56.478 [131/265] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.478 [132/265] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.478 [133/265] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:56.478 [134/265] Linking target lib/librte_kvargs.so.24.0 00:01:56.478 [135/265] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.478 [136/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:56.478 [137/265] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:56.478 [138/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:56.478 [139/265] Linking target lib/librte_telemetry.so.24.0 00:01:56.478 [140/265] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:56.478 [141/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:56.478 [142/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:01:56.478 [143/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:56.478 [144/265] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:56.478 [145/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:56.478 [146/265] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:56.478 [147/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:56.478 [148/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:56.478 [149/265] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:56.478 [150/265] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:56.478 [151/265] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:56.478 [152/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:56.478 [153/265] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:56.478 [154/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:56.478 [155/265] Linking static target lib/librte_timer.a 00:01:56.478 [156/265] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:56.478 [157/265] Linking static target lib/librte_cmdline.a 00:01:56.478 [158/265] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:01:56.478 [159/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:56.478 [160/265] Compiling C object 
lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:56.478 [161/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:56.478 [162/265] Linking static target lib/librte_reorder.a 00:01:56.478 [163/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:56.478 [164/265] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:56.478 [165/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:56.478 [166/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:56.478 [167/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:56.478 [168/265] Linking static target lib/librte_dmadev.a 00:01:56.478 [169/265] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:01:56.478 [170/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:56.478 [171/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:56.478 [172/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:56.478 [173/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:56.478 [174/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:56.478 [175/265] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:56.478 [176/265] Linking static target lib/librte_compressdev.a 00:01:56.478 [177/265] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:56.478 [178/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:56.478 [179/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:56.478 [180/265] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:56.478 [181/265] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:56.737 [182/265] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:56.737 [183/265] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:56.737 [184/265] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:56.737 [185/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:56.737 [186/265] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:56.737 [187/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:56.737 [188/265] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:56.737 [189/265] Linking static target lib/librte_security.a 00:01:56.737 [190/265] Linking static target lib/librte_power.a 00:01:56.737 [191/265] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:56.737 [192/265] Linking static target lib/librte_hash.a 00:01:56.737 [193/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:56.737 [194/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:56.737 [195/265] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.737 [196/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:56.737 [197/265] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:56.737 [198/265] Linking static target lib/librte_cryptodev.a 00:01:56.737 [199/265] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:56.998 [200/265] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:56.998 [201/265] Compiling C object 
drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:56.998 [202/265] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:56.998 [203/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:56.998 [204/265] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.999 [205/265] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:56.999 [206/265] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:56.999 [207/265] Linking static target drivers/librte_mempool_ring.a 00:01:56.999 [208/265] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:56.999 [209/265] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:56.999 [210/265] Linking static target drivers/librte_bus_vdev.a 00:01:56.999 [211/265] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.999 [212/265] Linking static target drivers/librte_bus_pci.a 00:01:56.999 [213/265] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.999 [214/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:56.999 [215/265] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.999 [216/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:57.259 [217/265] Linking static target lib/librte_ethdev.a 00:01:57.259 [218/265] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.259 [219/265] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.259 [220/265] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.519 [221/265] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.519 [222/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:57.780 [223/265] Linking static target lib/librte_vhost.a 00:01:57.780 [224/265] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.780 [225/265] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.780 [226/265] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.164 [227/265] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.106 [228/265] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.687 [229/265] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.227 [230/265] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.227 [231/265] Linking target lib/librte_eal.so.24.0 00:02:09.227 [232/265] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:02:09.227 [233/265] Linking target lib/librte_meter.so.24.0 00:02:09.227 [234/265] Linking target drivers/librte_bus_vdev.so.24.0 00:02:09.227 [235/265] Linking target lib/librte_pci.so.24.0 00:02:09.227 [236/265] Linking target lib/librte_timer.so.24.0 00:02:09.227 [237/265] Linking target lib/librte_ring.so.24.0 00:02:09.227 [238/265] Linking target 
lib/librte_dmadev.so.24.0 00:02:09.487 [239/265] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:02:09.488 [240/265] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:02:09.488 [241/265] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:02:09.488 [242/265] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:02:09.488 [243/265] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:02:09.488 [244/265] Linking target lib/librte_rcu.so.24.0 00:02:09.488 [245/265] Linking target lib/librte_mempool.so.24.0 00:02:09.488 [246/265] Linking target drivers/librte_bus_pci.so.24.0 00:02:09.488 [247/265] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:02:09.488 [248/265] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:02:09.748 [249/265] Linking target lib/librte_mbuf.so.24.0 00:02:09.748 [250/265] Linking target drivers/librte_mempool_ring.so.24.0 00:02:09.748 [251/265] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:02:10.009 [252/265] Linking target lib/librte_reorder.so.24.0 00:02:10.009 [253/265] Linking target lib/librte_net.so.24.0 00:02:10.009 [254/265] Linking target lib/librte_compressdev.so.24.0 00:02:10.009 [255/265] Linking target lib/librte_cryptodev.so.24.0 00:02:10.009 [256/265] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:02:10.009 [257/265] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:02:10.009 [258/265] Linking target lib/librte_security.so.24.0 00:02:10.009 [259/265] Linking target lib/librte_hash.so.24.0 00:02:10.009 [260/265] Linking target lib/librte_cmdline.so.24.0 00:02:10.009 [261/265] Linking target lib/librte_ethdev.so.24.0 00:02:10.269 [262/265] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:02:10.269 [263/265] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:02:10.269 [264/265] Linking target lib/librte_power.so.24.0 00:02:10.269 [265/265] Linking target lib/librte_vhost.so.24.0 00:02:10.269 INFO: autodetecting backend as ninja 00:02:10.269 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp -j 72 00:02:11.211 CC lib/ut/ut.o 00:02:11.211 CC lib/ut_mock/mock.o 00:02:11.211 CC lib/log/log.o 00:02:11.211 CC lib/log/log_flags.o 00:02:11.211 CC lib/log/log_deprecated.o 00:02:11.472 LIB libspdk_ut_mock.a 00:02:11.472 LIB libspdk_log.a 00:02:11.472 LIB libspdk_ut.a 00:02:11.731 CC lib/util/base64.o 00:02:11.731 CC lib/util/cpuset.o 00:02:11.731 CC lib/util/bit_array.o 00:02:11.731 CC lib/util/crc16.o 00:02:11.731 CC lib/util/crc32.o 00:02:11.731 CC lib/util/crc32_ieee.o 00:02:11.731 CC lib/util/crc32c.o 00:02:11.731 CC lib/util/crc64.o 00:02:11.731 CC lib/util/dif.o 00:02:11.731 CC lib/util/hexlify.o 00:02:11.731 CC lib/util/fd.o 00:02:11.731 CC lib/util/file.o 00:02:11.731 CC lib/util/iov.o 00:02:11.731 CC lib/util/pipe.o 00:02:11.731 CC lib/util/math.o 00:02:11.731 CC lib/util/strerror_tls.o 00:02:11.731 CC lib/util/string.o 00:02:11.731 CC lib/ioat/ioat.o 00:02:11.731 CC lib/util/uuid.o 00:02:11.731 CC lib/util/fd_group.o 00:02:11.731 CC lib/util/xor.o 00:02:11.731 CC lib/util/zipf.o 00:02:11.731 CXX lib/trace_parser/trace.o 00:02:11.731 CC lib/dma/dma.o 00:02:11.991 CC lib/vfio_user/host/vfio_user_pci.o 
00:02:11.991 CC lib/vfio_user/host/vfio_user.o 00:02:11.991 LIB libspdk_dma.a 00:02:11.991 LIB libspdk_ioat.a 00:02:12.251 LIB libspdk_vfio_user.a 00:02:12.251 LIB libspdk_util.a 00:02:12.511 CC lib/json/json_parse.o 00:02:12.511 CC lib/json/json_util.o 00:02:12.511 CC lib/conf/conf.o 00:02:12.511 CC lib/json/json_write.o 00:02:12.511 CC lib/rdma/common.o 00:02:12.511 CC lib/rdma/rdma_verbs.o 00:02:12.511 CC lib/vmd/vmd.o 00:02:12.511 CC lib/vmd/led.o 00:02:12.511 CC lib/env_dpdk/env.o 00:02:12.511 CC lib/env_dpdk/memory.o 00:02:12.511 CC lib/env_dpdk/threads.o 00:02:12.511 CC lib/env_dpdk/pci.o 00:02:12.511 CC lib/env_dpdk/init.o 00:02:12.511 CC lib/env_dpdk/pci_ioat.o 00:02:12.511 CC lib/idxd/idxd.o 00:02:12.511 CC lib/env_dpdk/pci_idxd.o 00:02:12.511 CC lib/env_dpdk/pci_virtio.o 00:02:12.511 CC lib/idxd/idxd_user.o 00:02:12.511 CC lib/env_dpdk/pci_vmd.o 00:02:12.511 CC lib/env_dpdk/pci_event.o 00:02:12.511 CC lib/env_dpdk/sigbus_handler.o 00:02:12.511 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:12.511 CC lib/env_dpdk/pci_dpdk.o 00:02:12.511 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:12.511 LIB libspdk_trace_parser.a 00:02:12.771 LIB libspdk_conf.a 00:02:12.771 LIB libspdk_rdma.a 00:02:12.771 LIB libspdk_json.a 00:02:12.771 LIB libspdk_idxd.a 00:02:12.771 LIB libspdk_vmd.a 00:02:13.032 CC lib/jsonrpc/jsonrpc_server.o 00:02:13.032 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:13.032 CC lib/jsonrpc/jsonrpc_client.o 00:02:13.032 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:13.293 LIB libspdk_jsonrpc.a 00:02:13.553 CC lib/rpc/rpc.o 00:02:13.553 LIB libspdk_env_dpdk.a 00:02:13.553 LIB libspdk_rpc.a 00:02:14.125 CC lib/trace/trace.o 00:02:14.125 CC lib/trace/trace_flags.o 00:02:14.125 CC lib/trace/trace_rpc.o 00:02:14.125 CC lib/keyring/keyring.o 00:02:14.125 CC lib/keyring/keyring_rpc.o 00:02:14.125 CC lib/notify/notify.o 00:02:14.125 CC lib/notify/notify_rpc.o 00:02:14.125 LIB libspdk_notify.a 00:02:14.125 LIB libspdk_trace.a 00:02:14.125 LIB libspdk_keyring.a 00:02:14.387 CC lib/sock/sock.o 00:02:14.387 CC lib/sock/sock_rpc.o 00:02:14.387 CC lib/thread/iobuf.o 00:02:14.387 CC lib/thread/thread.o 00:02:14.649 LIB libspdk_sock.a 00:02:14.909 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:14.909 CC lib/nvme/nvme_ctrlr.o 00:02:14.909 CC lib/nvme/nvme_fabric.o 00:02:14.909 CC lib/nvme/nvme_ns_cmd.o 00:02:14.909 CC lib/nvme/nvme_ns.o 00:02:14.909 CC lib/nvme/nvme_pcie.o 00:02:14.909 CC lib/nvme/nvme_pcie_common.o 00:02:14.909 CC lib/nvme/nvme_qpair.o 00:02:14.909 CC lib/nvme/nvme.o 00:02:14.909 CC lib/nvme/nvme_quirks.o 00:02:14.909 CC lib/nvme/nvme_transport.o 00:02:14.909 CC lib/nvme/nvme_discovery.o 00:02:14.909 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:14.909 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:14.909 CC lib/nvme/nvme_tcp.o 00:02:14.909 CC lib/nvme/nvme_opal.o 00:02:14.909 CC lib/nvme/nvme_io_msg.o 00:02:14.909 CC lib/nvme/nvme_poll_group.o 00:02:14.909 CC lib/nvme/nvme_zns.o 00:02:14.909 CC lib/nvme/nvme_stubs.o 00:02:14.909 CC lib/nvme/nvme_auth.o 00:02:14.909 CC lib/nvme/nvme_cuse.o 00:02:14.909 CC lib/nvme/nvme_vfio_user.o 00:02:14.909 CC lib/nvme/nvme_rdma.o 00:02:15.169 LIB libspdk_thread.a 00:02:15.429 CC lib/blob/request.o 00:02:15.429 CC lib/blob/blobstore.o 00:02:15.429 CC lib/blob/zeroes.o 00:02:15.429 CC lib/init/json_config.o 00:02:15.429 CC lib/init/subsystem.o 00:02:15.429 CC lib/blob/blob_bs_dev.o 00:02:15.429 CC lib/init/rpc.o 00:02:15.429 CC lib/init/subsystem_rpc.o 00:02:15.429 CC lib/virtio/virtio.o 00:02:15.429 CC lib/virtio/virtio_vhost_user.o 00:02:15.429 CC lib/virtio/virtio_vfio_user.o 
00:02:15.429 CC lib/virtio/virtio_pci.o 00:02:15.690 CC lib/accel/accel.o 00:02:15.690 CC lib/accel/accel_sw.o 00:02:15.690 CC lib/accel/accel_rpc.o 00:02:15.690 CC lib/vfu_tgt/tgt_endpoint.o 00:02:15.690 CC lib/vfu_tgt/tgt_rpc.o 00:02:15.690 LIB libspdk_init.a 00:02:15.690 LIB libspdk_virtio.a 00:02:15.690 LIB libspdk_vfu_tgt.a 00:02:15.951 CC lib/event/app.o 00:02:15.951 CC lib/event/reactor.o 00:02:15.951 CC lib/event/log_rpc.o 00:02:15.951 CC lib/event/app_rpc.o 00:02:15.951 CC lib/event/scheduler_static.o 00:02:16.211 LIB libspdk_accel.a 00:02:16.211 LIB libspdk_event.a 00:02:16.211 LIB libspdk_nvme.a 00:02:16.471 CC lib/bdev/bdev.o 00:02:16.471 CC lib/bdev/bdev_rpc.o 00:02:16.471 CC lib/bdev/bdev_zone.o 00:02:16.731 CC lib/bdev/part.o 00:02:16.731 CC lib/bdev/scsi_nvme.o 00:02:17.302 LIB libspdk_blob.a 00:02:17.561 CC lib/blobfs/blobfs.o 00:02:17.561 CC lib/blobfs/tree.o 00:02:17.561 CC lib/lvol/lvol.o 00:02:18.129 LIB libspdk_lvol.a 00:02:18.129 LIB libspdk_blobfs.a 00:02:18.388 LIB libspdk_bdev.a 00:02:18.652 CC lib/nbd/nbd.o 00:02:18.652 CC lib/nbd/nbd_rpc.o 00:02:18.652 CC lib/nvmf/ctrlr.o 00:02:18.652 CC lib/nvmf/ctrlr_discovery.o 00:02:18.652 CC lib/nvmf/ctrlr_bdev.o 00:02:18.652 CC lib/nvmf/subsystem.o 00:02:18.652 CC lib/nvmf/nvmf.o 00:02:18.652 CC lib/nvmf/nvmf_rpc.o 00:02:18.652 CC lib/scsi/dev.o 00:02:18.652 CC lib/nvmf/transport.o 00:02:18.652 CC lib/scsi/lun.o 00:02:18.652 CC lib/nvmf/tcp.o 00:02:18.652 CC lib/scsi/port.o 00:02:18.652 CC lib/nvmf/vfio_user.o 00:02:18.652 CC lib/ublk/ublk.o 00:02:18.652 CC lib/scsi/scsi.o 00:02:18.652 CC lib/ublk/ublk_rpc.o 00:02:18.652 CC lib/nvmf/rdma.o 00:02:18.652 CC lib/scsi/scsi_bdev.o 00:02:18.652 CC lib/scsi/scsi_pr.o 00:02:18.652 CC lib/ftl/ftl_core.o 00:02:18.652 CC lib/scsi/scsi_rpc.o 00:02:18.652 CC lib/ftl/ftl_init.o 00:02:18.652 CC lib/scsi/task.o 00:02:18.652 CC lib/ftl/ftl_layout.o 00:02:18.652 CC lib/ftl/ftl_debug.o 00:02:18.652 CC lib/ftl/ftl_io.o 00:02:18.652 CC lib/ftl/ftl_sb.o 00:02:18.652 CC lib/ftl/ftl_l2p.o 00:02:18.652 CC lib/ftl/ftl_l2p_flat.o 00:02:18.652 CC lib/ftl/ftl_nv_cache.o 00:02:18.652 CC lib/ftl/ftl_band.o 00:02:18.652 CC lib/ftl/ftl_band_ops.o 00:02:18.652 CC lib/ftl/ftl_rq.o 00:02:18.652 CC lib/ftl/ftl_writer.o 00:02:18.652 CC lib/ftl/ftl_reloc.o 00:02:18.652 CC lib/ftl/ftl_l2p_cache.o 00:02:18.652 CC lib/ftl/mngt/ftl_mngt.o 00:02:18.652 CC lib/ftl/ftl_p2l.o 00:02:18.652 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:18.652 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:18.652 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:18.652 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:18.652 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:18.652 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:18.652 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:18.652 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:18.652 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:18.652 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:18.652 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:18.652 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:18.652 CC lib/ftl/utils/ftl_conf.o 00:02:18.652 CC lib/ftl/utils/ftl_md.o 00:02:18.652 CC lib/ftl/utils/ftl_mempool.o 00:02:18.652 CC lib/ftl/utils/ftl_bitmap.o 00:02:18.652 CC lib/ftl/utils/ftl_property.o 00:02:18.652 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:18.652 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:18.652 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:18.652 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:18.652 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:18.652 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:18.652 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:18.652 CC lib/ftl/upgrade/ftl_sb_v5.o 
00:02:18.652 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:18.652 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:18.652 CC lib/ftl/base/ftl_base_dev.o 00:02:18.652 CC lib/ftl/base/ftl_base_bdev.o 00:02:18.652 CC lib/ftl/ftl_trace.o 00:02:19.221 LIB libspdk_nbd.a 00:02:19.221 LIB libspdk_scsi.a 00:02:19.221 LIB libspdk_ublk.a 00:02:19.481 LIB libspdk_ftl.a 00:02:19.481 CC lib/iscsi/conn.o 00:02:19.481 CC lib/iscsi/init_grp.o 00:02:19.481 CC lib/iscsi/iscsi.o 00:02:19.481 CC lib/iscsi/md5.o 00:02:19.481 CC lib/iscsi/iscsi_subsystem.o 00:02:19.481 CC lib/iscsi/param.o 00:02:19.481 CC lib/iscsi/portal_grp.o 00:02:19.481 CC lib/iscsi/tgt_node.o 00:02:19.481 CC lib/iscsi/task.o 00:02:19.481 CC lib/iscsi/iscsi_rpc.o 00:02:19.481 CC lib/vhost/vhost.o 00:02:19.481 CC lib/vhost/vhost_rpc.o 00:02:19.481 CC lib/vhost/vhost_blk.o 00:02:19.481 CC lib/vhost/vhost_scsi.o 00:02:19.481 CC lib/vhost/rte_vhost_user.o 00:02:20.050 LIB libspdk_nvmf.a 00:02:20.050 LIB libspdk_vhost.a 00:02:20.309 LIB libspdk_iscsi.a 00:02:20.880 CC module/vfu_device/vfu_virtio_blk.o 00:02:20.880 CC module/vfu_device/vfu_virtio.o 00:02:20.880 CC module/vfu_device/vfu_virtio_scsi.o 00:02:20.880 CC module/vfu_device/vfu_virtio_rpc.o 00:02:20.880 CC module/env_dpdk/env_dpdk_rpc.o 00:02:20.880 CC module/accel/ioat/accel_ioat_rpc.o 00:02:20.880 CC module/accel/ioat/accel_ioat.o 00:02:20.880 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:20.880 CC module/sock/posix/posix.o 00:02:20.880 CC module/keyring/file/keyring.o 00:02:20.880 CC module/accel/error/accel_error_rpc.o 00:02:20.880 CC module/accel/error/accel_error.o 00:02:20.880 CC module/keyring/file/keyring_rpc.o 00:02:20.880 CC module/blob/bdev/blob_bdev.o 00:02:20.880 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:20.880 CC module/accel/dsa/accel_dsa_rpc.o 00:02:20.880 CC module/accel/dsa/accel_dsa.o 00:02:20.880 LIB libspdk_env_dpdk_rpc.a 00:02:20.880 CC module/accel/iaa/accel_iaa_rpc.o 00:02:20.880 CC module/accel/iaa/accel_iaa.o 00:02:20.880 CC module/scheduler/gscheduler/gscheduler.o 00:02:21.139 LIB libspdk_keyring_file.a 00:02:21.139 LIB libspdk_scheduler_dynamic.a 00:02:21.139 LIB libspdk_accel_error.a 00:02:21.139 LIB libspdk_scheduler_dpdk_governor.a 00:02:21.139 LIB libspdk_accel_ioat.a 00:02:21.139 LIB libspdk_scheduler_gscheduler.a 00:02:21.139 LIB libspdk_accel_iaa.a 00:02:21.139 LIB libspdk_blob_bdev.a 00:02:21.139 LIB libspdk_accel_dsa.a 00:02:21.139 LIB libspdk_vfu_device.a 00:02:21.397 LIB libspdk_sock_posix.a 00:02:21.397 CC module/bdev/null/bdev_null.o 00:02:21.398 CC module/bdev/null/bdev_null_rpc.o 00:02:21.398 CC module/bdev/iscsi/bdev_iscsi.o 00:02:21.398 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:21.656 CC module/bdev/raid/bdev_raid.o 00:02:21.656 CC module/bdev/gpt/gpt.o 00:02:21.656 CC module/bdev/gpt/vbdev_gpt.o 00:02:21.656 CC module/bdev/raid/bdev_raid_sb.o 00:02:21.656 CC module/bdev/malloc/bdev_malloc.o 00:02:21.656 CC module/bdev/raid/bdev_raid_rpc.o 00:02:21.656 CC module/bdev/raid/concat.o 00:02:21.656 CC module/bdev/aio/bdev_aio_rpc.o 00:02:21.656 CC module/bdev/raid/raid1.o 00:02:21.656 CC module/bdev/split/vbdev_split_rpc.o 00:02:21.656 CC module/bdev/raid/raid0.o 00:02:21.656 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:21.656 CC module/bdev/aio/bdev_aio.o 00:02:21.656 CC module/bdev/split/vbdev_split.o 00:02:21.656 CC module/bdev/error/vbdev_error.o 00:02:21.656 CC module/bdev/error/vbdev_error_rpc.o 00:02:21.656 CC module/bdev/ftl/bdev_ftl.o 00:02:21.656 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:21.656 CC module/bdev/lvol/vbdev_lvol_rpc.o 
00:02:21.656 CC module/bdev/lvol/vbdev_lvol.o 00:02:21.656 CC module/bdev/nvme/bdev_nvme.o 00:02:21.656 CC module/bdev/delay/vbdev_delay.o 00:02:21.656 CC module/bdev/nvme/nvme_rpc.o 00:02:21.656 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:21.656 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:21.656 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:21.656 CC module/bdev/nvme/bdev_mdns_client.o 00:02:21.656 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:21.656 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:21.656 CC module/blobfs/bdev/blobfs_bdev.o 00:02:21.656 CC module/bdev/nvme/vbdev_opal.o 00:02:21.656 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:21.656 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:21.656 CC module/bdev/passthru/vbdev_passthru.o 00:02:21.656 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:21.656 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:21.656 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:21.656 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:21.656 LIB libspdk_bdev_gpt.a 00:02:21.656 LIB libspdk_bdev_ftl.a 00:02:21.656 LIB libspdk_blobfs_bdev.a 00:02:21.656 LIB libspdk_bdev_split.a 00:02:21.656 LIB libspdk_bdev_error.a 00:02:21.915 LIB libspdk_bdev_iscsi.a 00:02:21.915 LIB libspdk_bdev_passthru.a 00:02:21.915 LIB libspdk_bdev_aio.a 00:02:21.915 LIB libspdk_bdev_zone_block.a 00:02:21.915 LIB libspdk_bdev_malloc.a 00:02:21.915 LIB libspdk_bdev_null.a 00:02:21.915 LIB libspdk_bdev_delay.a 00:02:21.915 LIB libspdk_bdev_virtio.a 00:02:21.915 LIB libspdk_bdev_lvol.a 00:02:22.175 LIB libspdk_bdev_raid.a 00:02:22.743 LIB libspdk_bdev_nvme.a 00:02:23.683 CC module/event/subsystems/iobuf/iobuf.o 00:02:23.683 CC module/event/subsystems/sock/sock.o 00:02:23.683 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:23.683 CC module/event/subsystems/vmd/vmd.o 00:02:23.683 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:23.683 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:02:23.684 CC module/event/subsystems/scheduler/scheduler.o 00:02:23.684 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:23.684 CC module/event/subsystems/keyring/keyring.o 00:02:23.684 LIB libspdk_event_sock.a 00:02:23.684 LIB libspdk_event_vmd.a 00:02:23.684 LIB libspdk_event_vhost_blk.a 00:02:23.684 LIB libspdk_event_vfu_tgt.a 00:02:23.684 LIB libspdk_event_keyring.a 00:02:23.684 LIB libspdk_event_iobuf.a 00:02:23.684 LIB libspdk_event_scheduler.a 00:02:23.944 CC module/event/subsystems/accel/accel.o 00:02:23.944 LIB libspdk_event_accel.a 00:02:24.204 CC module/event/subsystems/bdev/bdev.o 00:02:24.464 LIB libspdk_event_bdev.a 00:02:24.723 CC module/event/subsystems/ublk/ublk.o 00:02:24.723 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:24.723 CC module/event/subsystems/nbd/nbd.o 00:02:24.724 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:24.724 CC module/event/subsystems/scsi/scsi.o 00:02:24.983 LIB libspdk_event_ublk.a 00:02:24.983 LIB libspdk_event_nbd.a 00:02:24.983 LIB libspdk_event_scsi.a 00:02:24.983 LIB libspdk_event_nvmf.a 00:02:25.243 CC module/event/subsystems/iscsi/iscsi.o 00:02:25.243 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:25.243 LIB libspdk_event_vhost_scsi.a 00:02:25.243 LIB libspdk_event_iscsi.a 00:02:25.815 CC app/spdk_nvme_identify/identify.o 00:02:25.815 CC app/spdk_lspci/spdk_lspci.o 00:02:25.815 CC app/spdk_nvme_discover/discovery_aer.o 00:02:25.815 CC app/trace_record/trace_record.o 00:02:25.815 CXX app/trace/trace.o 00:02:25.815 CC app/spdk_nvme_perf/perf.o 00:02:25.815 CC test/rpc_client/rpc_client_test.o 00:02:25.815 CC app/spdk_top/spdk_top.o 
00:02:25.815 TEST_HEADER include/spdk/accel.h 00:02:25.815 TEST_HEADER include/spdk/assert.h 00:02:25.815 TEST_HEADER include/spdk/accel_module.h 00:02:25.815 TEST_HEADER include/spdk/barrier.h 00:02:25.815 TEST_HEADER include/spdk/base64.h 00:02:25.815 TEST_HEADER include/spdk/bdev.h 00:02:25.815 TEST_HEADER include/spdk/bdev_module.h 00:02:25.815 TEST_HEADER include/spdk/bit_array.h 00:02:25.815 TEST_HEADER include/spdk/bdev_zone.h 00:02:25.815 TEST_HEADER include/spdk/bit_pool.h 00:02:25.815 TEST_HEADER include/spdk/blob_bdev.h 00:02:25.815 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:25.815 TEST_HEADER include/spdk/blobfs.h 00:02:25.815 TEST_HEADER include/spdk/blob.h 00:02:25.815 TEST_HEADER include/spdk/conf.h 00:02:25.815 TEST_HEADER include/spdk/config.h 00:02:25.815 TEST_HEADER include/spdk/cpuset.h 00:02:25.815 TEST_HEADER include/spdk/crc16.h 00:02:25.815 TEST_HEADER include/spdk/crc32.h 00:02:25.815 TEST_HEADER include/spdk/crc64.h 00:02:25.815 TEST_HEADER include/spdk/dif.h 00:02:25.815 TEST_HEADER include/spdk/dma.h 00:02:25.815 TEST_HEADER include/spdk/endian.h 00:02:25.815 TEST_HEADER include/spdk/env_dpdk.h 00:02:25.815 TEST_HEADER include/spdk/env.h 00:02:25.815 TEST_HEADER include/spdk/event.h 00:02:25.815 CC app/iscsi_tgt/iscsi_tgt.o 00:02:25.815 TEST_HEADER include/spdk/fd_group.h 00:02:25.815 CC app/nvmf_tgt/nvmf_main.o 00:02:25.815 CC app/spdk_dd/spdk_dd.o 00:02:25.815 TEST_HEADER include/spdk/fd.h 00:02:25.815 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:25.815 TEST_HEADER include/spdk/file.h 00:02:25.815 CC app/vhost/vhost.o 00:02:25.815 TEST_HEADER include/spdk/ftl.h 00:02:25.815 TEST_HEADER include/spdk/gpt_spec.h 00:02:25.815 TEST_HEADER include/spdk/hexlify.h 00:02:25.815 TEST_HEADER include/spdk/histogram_data.h 00:02:25.815 TEST_HEADER include/spdk/idxd.h 00:02:25.815 TEST_HEADER include/spdk/idxd_spec.h 00:02:25.815 TEST_HEADER include/spdk/init.h 00:02:25.815 CC app/spdk_tgt/spdk_tgt.o 00:02:25.815 TEST_HEADER include/spdk/ioat.h 00:02:25.815 TEST_HEADER include/spdk/ioat_spec.h 00:02:25.815 TEST_HEADER include/spdk/iscsi_spec.h 00:02:25.815 TEST_HEADER include/spdk/json.h 00:02:25.815 TEST_HEADER include/spdk/jsonrpc.h 00:02:25.815 TEST_HEADER include/spdk/keyring.h 00:02:25.815 TEST_HEADER include/spdk/keyring_module.h 00:02:25.815 TEST_HEADER include/spdk/likely.h 00:02:25.815 TEST_HEADER include/spdk/log.h 00:02:25.815 TEST_HEADER include/spdk/lvol.h 00:02:25.815 CC test/app/stub/stub.o 00:02:25.815 CC app/fio/nvme/fio_plugin.o 00:02:25.815 TEST_HEADER include/spdk/memory.h 00:02:25.815 CC test/event/reactor_perf/reactor_perf.o 00:02:25.815 CC examples/ioat/perf/perf.o 00:02:25.815 TEST_HEADER include/spdk/mmio.h 00:02:25.815 CC test/event/reactor/reactor.o 00:02:25.815 CC test/env/pci/pci_ut.o 00:02:25.815 TEST_HEADER include/spdk/nbd.h 00:02:25.815 CC test/thread/lock/spdk_lock.o 00:02:25.815 CC test/app/histogram_perf/histogram_perf.o 00:02:25.815 CC test/env/vtophys/vtophys.o 00:02:25.815 TEST_HEADER include/spdk/notify.h 00:02:25.815 CC examples/vmd/led/led.o 00:02:25.815 TEST_HEADER include/spdk/nvme.h 00:02:25.815 CC test/env/memory/memory_ut.o 00:02:25.815 CC test/nvme/reset/reset.o 00:02:25.815 CC test/app/jsoncat/jsoncat.o 00:02:25.815 CC examples/idxd/perf/perf.o 00:02:25.815 CC examples/nvme/hotplug/hotplug.o 00:02:25.815 TEST_HEADER include/spdk/nvme_intel.h 00:02:25.815 CC test/thread/poller_perf/poller_perf.o 00:02:25.815 CC test/event/event_perf/event_perf.o 00:02:25.815 CC test/nvme/aer/aer.o 00:02:25.815 CC examples/util/zipf/zipf.o 
00:02:25.815 CC examples/accel/perf/accel_perf.o 00:02:25.815 CC test/nvme/sgl/sgl.o 00:02:25.815 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:25.815 CC test/nvme/overhead/overhead.o 00:02:25.815 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:25.815 CC examples/ioat/verify/verify.o 00:02:25.815 CC test/nvme/e2edp/nvme_dp.o 00:02:25.815 CC examples/vmd/lsvmd/lsvmd.o 00:02:25.815 CC examples/sock/hello_world/hello_sock.o 00:02:25.815 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:25.815 CC test/nvme/simple_copy/simple_copy.o 00:02:25.815 CC examples/nvme/hello_world/hello_world.o 00:02:25.815 CC examples/nvme/abort/abort.o 00:02:25.815 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:25.815 TEST_HEADER include/spdk/nvme_spec.h 00:02:25.815 CC examples/nvme/arbitration/arbitration.o 00:02:25.815 CC test/nvme/err_injection/err_injection.o 00:02:25.815 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:25.815 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:25.815 CC examples/nvme/reconnect/reconnect.o 00:02:25.815 CC test/nvme/startup/startup.o 00:02:25.815 CC test/nvme/reserve/reserve.o 00:02:25.815 CC test/nvme/boot_partition/boot_partition.o 00:02:25.815 CC test/nvme/connect_stress/connect_stress.o 00:02:25.815 TEST_HEADER include/spdk/nvme_zns.h 00:02:25.815 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:25.815 CC test/event/app_repeat/app_repeat.o 00:02:25.815 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:25.815 LINK spdk_lspci 00:02:25.815 TEST_HEADER include/spdk/nvmf.h 00:02:25.815 TEST_HEADER include/spdk/nvmf_spec.h 00:02:25.815 TEST_HEADER include/spdk/nvmf_transport.h 00:02:25.815 CC app/fio/bdev/fio_plugin.o 00:02:25.815 TEST_HEADER include/spdk/opal.h 00:02:25.815 CC test/event/scheduler/scheduler.o 00:02:25.815 TEST_HEADER include/spdk/opal_spec.h 00:02:25.815 CC examples/nvmf/nvmf/nvmf.o 00:02:25.815 TEST_HEADER include/spdk/pci_ids.h 00:02:25.815 CC examples/blob/hello_world/hello_blob.o 00:02:25.815 CC test/accel/dif/dif.o 00:02:25.815 TEST_HEADER include/spdk/pipe.h 00:02:25.815 TEST_HEADER include/spdk/queue.h 00:02:25.815 CC examples/bdev/hello_world/hello_bdev.o 00:02:25.815 CC examples/blob/cli/blobcli.o 00:02:25.815 TEST_HEADER include/spdk/reduce.h 00:02:25.815 TEST_HEADER include/spdk/rpc.h 00:02:25.815 CC test/app/bdev_svc/bdev_svc.o 00:02:25.815 CC examples/bdev/bdevperf/bdevperf.o 00:02:25.815 CC test/bdev/bdevio/bdevio.o 00:02:25.815 TEST_HEADER include/spdk/scheduler.h 00:02:25.815 CC test/dma/test_dma/test_dma.o 00:02:25.815 CC test/blobfs/mkfs/mkfs.o 00:02:25.815 TEST_HEADER include/spdk/scsi.h 00:02:25.815 TEST_HEADER include/spdk/scsi_spec.h 00:02:25.815 TEST_HEADER include/spdk/sock.h 00:02:25.815 TEST_HEADER include/spdk/stdinc.h 00:02:25.815 TEST_HEADER include/spdk/string.h 00:02:25.815 CC examples/thread/thread/thread_ex.o 00:02:25.815 TEST_HEADER include/spdk/thread.h 00:02:25.815 TEST_HEADER include/spdk/trace.h 00:02:25.815 TEST_HEADER include/spdk/trace_parser.h 00:02:25.815 TEST_HEADER include/spdk/tree.h 00:02:25.816 TEST_HEADER include/spdk/ublk.h 00:02:25.816 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:25.816 TEST_HEADER include/spdk/util.h 00:02:25.816 TEST_HEADER include/spdk/uuid.h 00:02:25.816 TEST_HEADER include/spdk/version.h 00:02:25.816 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:25.816 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:25.816 LINK rpc_client_test 00:02:25.816 TEST_HEADER include/spdk/vhost.h 00:02:25.816 TEST_HEADER include/spdk/vmd.h 00:02:25.816 TEST_HEADER include/spdk/xor.h 00:02:25.816 TEST_HEADER 
include/spdk/zipf.h 00:02:26.087 CXX test/cpp_headers/accel.o 00:02:26.087 CC test/env/mem_callbacks/mem_callbacks.o 00:02:26.087 CC test/lvol/esnap/esnap.o 00:02:26.087 LINK spdk_nvme_discover 00:02:26.087 LINK spdk_trace_record 00:02:26.087 LINK nvmf_tgt 00:02:26.087 LINK reactor_perf 00:02:26.087 LINK interrupt_tgt 00:02:26.087 LINK histogram_perf 00:02:26.087 LINK reactor 00:02:26.087 LINK jsoncat 00:02:26.087 LINK vtophys 00:02:26.087 LINK lsvmd 00:02:26.087 LINK event_perf 00:02:26.087 LINK vhost 00:02:26.087 LINK zipf 00:02:26.087 LINK led 00:02:26.087 LINK poller_perf 00:02:26.087 LINK stub 00:02:26.087 LINK iscsi_tgt 00:02:26.087 LINK env_dpdk_post_init 00:02:26.087 LINK spdk_tgt 00:02:26.087 LINK app_repeat 00:02:26.087 LINK startup 00:02:26.087 LINK pmr_persistence 00:02:26.087 LINK boot_partition 00:02:26.087 LINK connect_stress 00:02:26.087 LINK err_injection 00:02:26.087 LINK ioat_perf 00:02:26.087 LINK hotplug 00:02:26.087 LINK verify 00:02:26.087 LINK reserve 00:02:26.087 LINK hello_world 00:02:26.087 LINK bdev_svc 00:02:26.087 LINK cmb_copy 00:02:26.087 LINK hello_sock 00:02:26.087 LINK mkfs 00:02:26.087 LINK aer 00:02:26.087 LINK sgl 00:02:26.087 LINK hello_blob 00:02:26.087 LINK simple_copy 00:02:26.088 LINK scheduler 00:02:26.088 LINK spdk_trace 00:02:26.088 LINK nvme_dp 00:02:26.088 LINK hello_bdev 00:02:26.088 CXX test/cpp_headers/accel_module.o 00:02:26.088 LINK reset 00:02:26.360 LINK overhead 00:02:26.360 LINK thread 00:02:26.360 LINK nvmf 00:02:26.360 LINK abort 00:02:26.360 LINK idxd_perf 00:02:26.360 LINK arbitration 00:02:26.360 LINK pci_ut 00:02:26.360 LINK reconnect 00:02:26.360 LINK test_dma 00:02:26.360 fio_plugin.c:1491:29: warning: field 'ruhs' with variable sized type 'struct spdk_nvme_fdp_ruhs' not at the end of a struct or class is a GNU extension [-Wgnu-variable-sized-type-not-at-end] 00:02:26.360 struct spdk_nvme_fdp_ruhs ruhs; 00:02:26.360 ^ 00:02:26.360 LINK spdk_dd 00:02:26.360 LINK bdevio 00:02:26.360 LINK nvme_fuzz 00:02:26.360 LINK dif 00:02:26.360 LINK accel_perf 00:02:26.624 CXX test/cpp_headers/assert.o 00:02:26.624 LINK nvme_manage 00:02:26.624 CXX test/cpp_headers/barrier.o 00:02:26.624 LINK blobcli 00:02:26.624 LINK mem_callbacks 00:02:26.624 LINK spdk_nvme_perf 00:02:26.624 CC test/nvme/compliance/nvme_compliance.o 00:02:26.624 CXX test/cpp_headers/base64.o 00:02:26.624 CXX test/cpp_headers/bdev.o 00:02:26.886 1 warning generated. 
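
The "1 warning generated" above refers to the clang diagnostic emitted while building fio_plugin.c: a struct that ends in a flexible array member (struct spdk_nvme_fdp_ruhs) is embedded before other fields of an enclosing struct, which clang accepts only as a GNU extension. A minimal stand-alone reproduction, with stand-in struct names rather than SPDK's real definitions:

```sh
# Hedged repro of the -Wgnu-variable-sized-type-not-at-end diagnostic above.
cat > /tmp/fam_mid.c <<'EOF'
/* A struct ending in a flexible array member (FAM)... */
struct ruhs { unsigned short count; unsigned int desc[]; };
/* ...placed before other fields of an enclosing struct: a GNU extension,
 * which clang reports as "field 'ruhs' with variable sized type ... not at
 * the end of a struct or class". */
struct cmd_ctx { struct ruhs ruhs; int fd; };
int main(void) { return 0; }
EOF
clang -Wgnu-variable-sized-type-not-at-end -c /tmp/fam_mid.c -o /tmp/fam_mid.o
```
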
00:02:26.886 LINK spdk_bdev 00:02:26.886 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:26.886 LINK spdk_nvme_identify 00:02:26.886 CC test/nvme/fused_ordering/fused_ordering.o 00:02:26.886 LINK spdk_nvme 00:02:26.886 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:26.886 LINK memory_ut 00:02:26.886 CC test/nvme/fdp/fdp.o 00:02:26.887 CXX test/cpp_headers/bdev_module.o 00:02:26.887 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:02:26.887 CXX test/cpp_headers/bdev_zone.o 00:02:26.887 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:02:26.887 CC test/nvme/cuse/cuse.o 00:02:26.887 CXX test/cpp_headers/bit_array.o 00:02:26.887 CXX test/cpp_headers/bit_pool.o 00:02:26.887 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:26.887 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:27.153 LINK bdevperf 00:02:27.153 CXX test/cpp_headers/blob_bdev.o 00:02:27.153 CXX test/cpp_headers/blobfs.o 00:02:27.153 CXX test/cpp_headers/blobfs_bdev.o 00:02:27.153 CXX test/cpp_headers/blob.o 00:02:27.153 LINK fused_ordering 00:02:27.153 CXX test/cpp_headers/conf.o 00:02:27.153 CXX test/cpp_headers/config.o 00:02:27.153 CXX test/cpp_headers/cpuset.o 00:02:27.153 CXX test/cpp_headers/crc16.o 00:02:27.153 LINK doorbell_aers 00:02:27.153 LINK spdk_top 00:02:27.153 CXX test/cpp_headers/crc32.o 00:02:27.153 LINK nvme_compliance 00:02:27.153 CXX test/cpp_headers/crc64.o 00:02:27.153 CXX test/cpp_headers/dif.o 00:02:27.153 CXX test/cpp_headers/dma.o 00:02:27.153 CXX test/cpp_headers/endian.o 00:02:27.153 CXX test/cpp_headers/env.o 00:02:27.153 CXX test/cpp_headers/env_dpdk.o 00:02:27.153 LINK fdp 00:02:27.153 CXX test/cpp_headers/event.o 00:02:27.153 CXX test/cpp_headers/fd_group.o 00:02:27.153 CXX test/cpp_headers/fd.o 00:02:27.421 CXX test/cpp_headers/ftl.o 00:02:27.421 CXX test/cpp_headers/file.o 00:02:27.421 CXX test/cpp_headers/gpt_spec.o 00:02:27.421 CXX test/cpp_headers/hexlify.o 00:02:27.421 CXX test/cpp_headers/histogram_data.o 00:02:27.421 LINK llvm_vfio_fuzz 00:02:27.421 CXX test/cpp_headers/idxd.o 00:02:27.421 CXX test/cpp_headers/idxd_spec.o 00:02:27.421 CXX test/cpp_headers/init.o 00:02:27.421 CXX test/cpp_headers/ioat.o 00:02:27.421 CXX test/cpp_headers/ioat_spec.o 00:02:27.421 CXX test/cpp_headers/iscsi_spec.o 00:02:27.421 CXX test/cpp_headers/json.o 00:02:27.421 CXX test/cpp_headers/jsonrpc.o 00:02:27.421 CXX test/cpp_headers/keyring.o 00:02:27.421 CXX test/cpp_headers/keyring_module.o 00:02:27.421 CXX test/cpp_headers/likely.o 00:02:27.421 CXX test/cpp_headers/log.o 00:02:27.421 CXX test/cpp_headers/lvol.o 00:02:27.421 CXX test/cpp_headers/memory.o 00:02:27.421 CXX test/cpp_headers/mmio.o 00:02:27.421 CXX test/cpp_headers/nbd.o 00:02:27.686 LINK vhost_fuzz 00:02:27.686 CXX test/cpp_headers/notify.o 00:02:27.686 CXX test/cpp_headers/nvme.o 00:02:27.686 CXX test/cpp_headers/nvme_intel.o 00:02:27.686 LINK spdk_lock 00:02:27.686 CXX test/cpp_headers/nvme_ocssd.o 00:02:27.686 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:27.686 CXX test/cpp_headers/nvme_spec.o 00:02:27.686 CXX test/cpp_headers/nvme_zns.o 00:02:27.686 CXX test/cpp_headers/nvmf_cmd.o 00:02:27.686 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:27.686 CXX test/cpp_headers/nvmf.o 00:02:27.686 CXX test/cpp_headers/nvmf_spec.o 00:02:27.686 CXX test/cpp_headers/nvmf_transport.o 00:02:27.686 CXX test/cpp_headers/opal.o 00:02:27.686 CXX test/cpp_headers/opal_spec.o 00:02:27.686 CXX test/cpp_headers/pci_ids.o 00:02:27.686 CXX test/cpp_headers/pipe.o 00:02:27.686 LINK llvm_nvme_fuzz 00:02:27.686 CXX test/cpp_headers/queue.o 00:02:27.686 CXX 
test/cpp_headers/reduce.o 00:02:27.686 CXX test/cpp_headers/rpc.o 00:02:27.686 CXX test/cpp_headers/scheduler.o 00:02:27.686 CXX test/cpp_headers/scsi.o 00:02:27.686 CXX test/cpp_headers/scsi_spec.o 00:02:27.686 CXX test/cpp_headers/sock.o 00:02:27.686 CXX test/cpp_headers/stdinc.o 00:02:27.686 CXX test/cpp_headers/string.o 00:02:27.686 CXX test/cpp_headers/thread.o 00:02:27.686 CXX test/cpp_headers/trace.o 00:02:27.686 CXX test/cpp_headers/trace_parser.o 00:02:27.686 CXX test/cpp_headers/tree.o 00:02:27.686 CXX test/cpp_headers/ublk.o 00:02:27.686 CXX test/cpp_headers/util.o 00:02:27.946 CXX test/cpp_headers/uuid.o 00:02:27.946 CXX test/cpp_headers/version.o 00:02:27.946 CXX test/cpp_headers/vfio_user_pci.o 00:02:27.946 CXX test/cpp_headers/vfio_user_spec.o 00:02:27.946 CXX test/cpp_headers/vhost.o 00:02:27.946 CXX test/cpp_headers/vmd.o 00:02:27.946 CXX test/cpp_headers/xor.o 00:02:27.946 CXX test/cpp_headers/zipf.o 00:02:28.206 LINK cuse 00:02:28.466 LINK iscsi_fuzz 00:02:30.376 LINK esnap 00:02:30.636 00:02:30.636 real 0m44.781s 00:02:30.636 user 6m51.659s 00:02:30.636 sys 2m37.366s 00:02:30.636 10:20:52 -- common/autotest_common.sh@1112 -- $ xtrace_disable 00:02:30.636 10:20:52 -- common/autotest_common.sh@10 -- $ set +x 00:02:30.636 ************************************ 00:02:30.636 END TEST make 00:02:30.636 ************************************ 00:02:30.636 10:20:52 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:02:30.636 10:20:52 -- pm/common@30 -- $ signal_monitor_resources TERM 00:02:30.636 10:20:52 -- pm/common@41 -- $ local monitor pid pids signal=TERM 00:02:30.636 10:20:52 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:30.636 10:20:52 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:02:30.636 10:20:52 -- pm/common@45 -- $ pid=83419 00:02:30.636 10:20:52 -- pm/common@52 -- $ sudo kill -TERM 83419 00:02:30.636 10:20:52 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:30.636 10:20:52 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:02:30.636 10:20:52 -- pm/common@45 -- $ pid=83421 00:02:30.636 10:20:52 -- pm/common@52 -- $ sudo kill -TERM 83421 00:02:30.897 10:20:52 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:30.897 10:20:52 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:02:30.897 10:20:52 -- pm/common@45 -- $ pid=83420 00:02:30.897 10:20:52 -- pm/common@52 -- $ sudo kill -TERM 83420 00:02:30.897 10:20:52 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:30.897 10:20:52 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:02:30.897 10:20:52 -- pm/common@46 -- $ [[ -n '' ]] 00:02:30.897 10:20:52 -- pm/common@49 -- $ continue 00:02:30.897 10:20:52 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:02:30.897 10:20:52 -- nvmf/common.sh@7 -- # uname -s 00:02:30.897 10:20:52 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:30.897 10:20:52 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:30.897 10:20:52 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:30.897 10:20:52 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:30.897 10:20:52 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:30.897 10:20:52 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 
00:02:30.897 10:20:52 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:30.897 10:20:52 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:30.897 10:20:52 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:30.897 10:20:52 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:30.897 10:20:52 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:800e967b-538f-e911-906e-001635649f5c 00:02:30.897 10:20:52 -- nvmf/common.sh@18 -- # NVME_HOSTID=800e967b-538f-e911-906e-001635649f5c 00:02:30.897 10:20:52 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:30.897 10:20:52 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:30.897 10:20:52 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:30.897 10:20:52 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:02:30.897 10:20:52 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:02:30.897 10:20:52 -- scripts/common.sh@502 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:30.897 10:20:52 -- scripts/common.sh@510 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:30.897 10:20:52 -- scripts/common.sh@511 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:30.897 10:20:52 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:30.897 10:20:52 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:30.897 10:20:52 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:30.897 10:20:52 -- paths/export.sh@5 -- # export PATH 00:02:30.897 10:20:52 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:30.897 10:20:53 -- nvmf/common.sh@47 -- # : 0 00:02:30.897 10:20:53 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:02:30.897 10:20:53 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:02:30.897 10:20:53 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:02:30.897 10:20:53 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:30.897 10:20:53 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:30.897 10:20:53 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:02:30.897 10:20:53 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:02:30.897 10:20:53 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:02:31.157 10:20:53 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:31.157 10:20:53 -- spdk/autotest.sh@32 -- # uname -s 00:02:31.157 10:20:53 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:31.157 10:20:53 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:31.157 10:20:53 -- 
spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:31.157 10:20:53 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:31.157 10:20:53 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:31.157 10:20:53 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:31.157 10:20:53 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:31.157 10:20:53 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:31.157 10:20:53 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:31.158 10:20:53 -- spdk/autotest.sh@48 -- # udevadm_pid=141157 00:02:31.158 10:20:53 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:02:31.158 10:20:53 -- pm/common@17 -- # local monitor 00:02:31.158 10:20:53 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:31.158 10:20:53 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=141158 00:02:31.158 10:20:53 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:31.158 10:20:53 -- pm/common@21 -- # date +%s 00:02:31.158 10:20:53 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=141161 00:02:31.158 10:20:53 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:31.158 10:20:53 -- pm/common@21 -- # date +%s 00:02:31.158 10:20:53 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=141165 00:02:31.158 10:20:53 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:31.158 10:20:53 -- pm/common@21 -- # date +%s 00:02:31.158 10:20:53 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=141169 00:02:31.158 10:20:53 -- pm/common@26 -- # sleep 1 00:02:31.158 10:20:53 -- pm/common@21 -- # date +%s 00:02:31.158 10:20:53 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1713514853 00:02:31.158 10:20:53 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1713514853 00:02:31.158 10:20:53 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1713514853 00:02:31.158 10:20:53 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1713514853 00:02:31.158 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1713514853_collect-vmstat.pm.log 00:02:31.158 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1713514853_collect-bmc-pm.bmc.pm.log 00:02:31.158 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1713514853_collect-cpu-load.pm.log 00:02:31.158 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1713514853_collect-cpu-temp.pm.log 00:02:32.095 10:20:54 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:32.095 10:20:54 -- spdk/autotest.sh@57 -- # timing_enter 
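
The four "Redirecting to ... pm.log" lines above show the power/resource monitors being launched in the background with their PIDs recorded, and the stop_monitor_resources trace earlier in the log (the `sudo kill -TERM <pid>` calls guarded by `[[ -e .../collect-*.pid ]]`) is the matching teardown. A condensed sketch of that PID-file pattern; the helper names, POWER_DIR variable, and file layout are illustrative, not SPDK's actual pm/common code:

```sh
# Hedged sketch of a start/stop monitor pattern like the pm/common traces above.
POWER_DIR=${POWER_DIR:-/tmp/power}   # stands in for .../output/power
mkdir -p "$POWER_DIR"

start_monitor() {                    # usage: start_monitor <name> <cmd...>
  local name=$1; shift
  "$@" > "$POWER_DIR/$name.pm.log" 2>&1 &
  echo $! > "$POWER_DIR/$name.pid"   # record the PID for later teardown
}

stop_monitors() {
  local pidfile
  for pidfile in "$POWER_DIR"/*.pid; do
    [[ -e $pidfile ]] || continue
    sudo kill -TERM "$(<"$pidfile")" 2>/dev/null || :  # mirrors 'kill -TERM <pid>'
    rm -f "$pidfile"
  done
}
```
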
autotest 00:02:32.095 10:20:54 -- common/autotest_common.sh@710 -- # xtrace_disable 00:02:32.095 10:20:54 -- common/autotest_common.sh@10 -- # set +x 00:02:32.095 10:20:54 -- spdk/autotest.sh@59 -- # create_test_list 00:02:32.095 10:20:54 -- common/autotest_common.sh@734 -- # xtrace_disable 00:02:32.095 10:20:54 -- common/autotest_common.sh@10 -- # set +x 00:02:32.095 10:20:54 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:02:32.095 10:20:54 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:32.095 10:20:54 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:32.095 10:20:54 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:02:32.095 10:20:54 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:32.095 10:20:54 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:02:32.095 10:20:54 -- common/autotest_common.sh@1441 -- # uname 00:02:32.095 10:20:54 -- common/autotest_common.sh@1441 -- # '[' Linux = FreeBSD ']' 00:02:32.095 10:20:54 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:02:32.095 10:20:54 -- common/autotest_common.sh@1461 -- # uname 00:02:32.095 10:20:54 -- common/autotest_common.sh@1461 -- # [[ Linux = FreeBSD ]] 00:02:32.095 10:20:54 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:02:32.095 10:20:54 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=clang 00:02:32.095 10:20:54 -- spdk/autotest.sh@72 -- # hash lcov 00:02:32.095 10:20:54 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:02:32.095 10:20:54 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:02:32.095 10:20:54 -- common/autotest_common.sh@710 -- # xtrace_disable 00:02:32.095 10:20:54 -- common/autotest_common.sh@10 -- # set +x 00:02:32.095 10:20:54 -- spdk/autotest.sh@91 -- # rm -f 00:02:32.095 10:20:54 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:36.295 0000:5e:00.0 (144d a80a): Already using the nvme driver 00:02:36.295 0000:af:00.0 (8086 2701): Already using the nvme driver 00:02:36.295 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:02:36.295 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:02:36.295 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:02:36.295 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:02:36.295 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:02:36.295 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:02:36.295 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:02:36.295 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:02:36.295 0000:b0:00.0 (8086 4140): Already using the nvme driver 00:02:36.295 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:02:36.295 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:02:36.295 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:02:36.295 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:02:36.295 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:02:36.295 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:02:36.295 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:02:36.295 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:02:36.295 10:20:58 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:02:36.295 10:20:58 -- common/autotest_common.sh@1655 -- # 
zoned_devs=() 00:02:36.295 10:20:58 -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:02:36.295 10:20:58 -- common/autotest_common.sh@1656 -- # local nvme bdf 00:02:36.295 10:20:58 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:02:36.295 10:20:58 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:02:36.295 10:20:58 -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:02:36.295 10:20:58 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:36.295 10:20:58 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:02:36.295 10:20:58 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:02:36.295 10:20:58 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme1n1 00:02:36.295 10:20:58 -- common/autotest_common.sh@1648 -- # local device=nvme1n1 00:02:36.295 10:20:58 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme1n1/queue/zoned ]] 00:02:36.295 10:20:58 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:02:36.295 10:20:58 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:02:36.295 10:20:58 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme2n1 00:02:36.295 10:20:58 -- common/autotest_common.sh@1648 -- # local device=nvme2n1 00:02:36.295 10:20:58 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme2n1/queue/zoned ]] 00:02:36.295 10:20:58 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:02:36.295 10:20:58 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:02:36.295 10:20:58 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:02:36.295 10:20:58 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:02:36.295 10:20:58 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:02:36.295 10:20:58 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:02:36.295 10:20:58 -- scripts/common.sh@387 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:02:36.295 No valid GPT data, bailing 00:02:36.295 10:20:58 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:02:36.295 10:20:58 -- scripts/common.sh@391 -- # pt= 00:02:36.295 10:20:58 -- scripts/common.sh@392 -- # return 1 00:02:36.295 10:20:58 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:02:36.295 1+0 records in 00:02:36.295 1+0 records out 00:02:36.295 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00180727 s, 580 MB/s 00:02:36.295 10:20:58 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:02:36.295 10:20:58 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:02:36.295 10:20:58 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme1n1 00:02:36.295 10:20:58 -- scripts/common.sh@378 -- # local block=/dev/nvme1n1 pt 00:02:36.295 10:20:58 -- scripts/common.sh@387 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme1n1 00:02:36.295 No valid GPT data, bailing 00:02:36.295 10:20:58 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme1n1 00:02:36.295 10:20:58 -- scripts/common.sh@391 -- # pt= 00:02:36.295 10:20:58 -- scripts/common.sh@392 -- # return 1 00:02:36.295 10:20:58 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme1n1 bs=1M count=1 00:02:36.295 1+0 records in 00:02:36.295 1+0 records out 00:02:36.295 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00409635 s, 256 MB/s 00:02:36.295 10:20:58 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:02:36.295 10:20:58 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:02:36.295 10:20:58 -- 
spdk/autotest.sh@113 -- # block_in_use /dev/nvme2n1 00:02:36.295 10:20:58 -- scripts/common.sh@378 -- # local block=/dev/nvme2n1 pt 00:02:36.295 10:20:58 -- scripts/common.sh@387 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme2n1 00:02:36.553 No valid GPT data, bailing 00:02:36.553 10:20:58 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme2n1 00:02:36.553 10:20:58 -- scripts/common.sh@391 -- # pt= 00:02:36.553 10:20:58 -- scripts/common.sh@392 -- # return 1 00:02:36.553 10:20:58 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme2n1 bs=1M count=1 00:02:36.553 1+0 records in 00:02:36.553 1+0 records out 00:02:36.553 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00407205 s, 258 MB/s 00:02:36.553 10:20:58 -- spdk/autotest.sh@118 -- # sync 00:02:36.553 10:20:58 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:02:36.553 10:20:58 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:02:36.553 10:20:58 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:02:41.829 10:21:03 -- spdk/autotest.sh@124 -- # uname -s 00:02:41.829 10:21:03 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:02:41.829 10:21:03 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:02:41.829 10:21:03 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:41.829 10:21:03 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:41.829 10:21:03 -- common/autotest_common.sh@10 -- # set +x 00:02:41.829 ************************************ 00:02:41.829 START TEST setup.sh 00:02:41.829 ************************************ 00:02:41.829 10:21:03 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:02:41.829 * Looking for test storage... 00:02:41.829 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:02:41.829 10:21:03 -- setup/test-setup.sh@10 -- # uname -s 00:02:41.829 10:21:03 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:02:41.829 10:21:03 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:02:41.829 10:21:03 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:41.829 10:21:03 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:41.829 10:21:03 -- common/autotest_common.sh@10 -- # set +x 00:02:41.829 ************************************ 00:02:41.829 START TEST acl 00:02:41.829 ************************************ 00:02:41.829 10:21:03 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:02:41.829 * Looking for test storage... 
00:02:41.829 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:02:41.829 10:21:03 -- setup/acl.sh@10 -- # get_zoned_devs 00:02:41.829 10:21:03 -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:02:41.829 10:21:03 -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:02:41.829 10:21:03 -- common/autotest_common.sh@1656 -- # local nvme bdf 00:02:41.829 10:21:03 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:02:41.829 10:21:03 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:02:41.829 10:21:03 -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:02:41.829 10:21:03 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:41.829 10:21:03 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:02:41.829 10:21:03 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:02:41.829 10:21:03 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme1n1 00:02:41.829 10:21:03 -- common/autotest_common.sh@1648 -- # local device=nvme1n1 00:02:41.829 10:21:03 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme1n1/queue/zoned ]] 00:02:41.829 10:21:03 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:02:41.829 10:21:03 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:02:41.829 10:21:03 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme2n1 00:02:41.829 10:21:03 -- common/autotest_common.sh@1648 -- # local device=nvme2n1 00:02:41.829 10:21:03 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme2n1/queue/zoned ]] 00:02:41.829 10:21:03 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:02:41.829 10:21:03 -- setup/acl.sh@12 -- # devs=() 00:02:41.829 10:21:03 -- setup/acl.sh@12 -- # declare -a devs 00:02:41.829 10:21:03 -- setup/acl.sh@13 -- # drivers=() 00:02:41.829 10:21:03 -- setup/acl.sh@13 -- # declare -A drivers 00:02:41.829 10:21:03 -- setup/acl.sh@51 -- # setup reset 00:02:41.829 10:21:03 -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:41.829 10:21:03 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:46.027 10:21:07 -- setup/acl.sh@52 -- # collect_setup_devs 00:02:46.027 10:21:07 -- setup/acl.sh@16 -- # local dev driver 00:02:46.027 10:21:07 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:46.027 10:21:07 -- setup/acl.sh@15 -- # setup output status 00:02:46.027 10:21:07 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:46.027 10:21:07 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:02:50.229 Hugepages 00:02:50.229 node hugesize free / total 00:02:50.229 10:21:11 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:02:50.229 10:21:11 -- setup/acl.sh@19 -- # continue 00:02:50.229 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.229 10:21:11 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:02:50.229 10:21:11 -- setup/acl.sh@19 -- # continue 00:02:50.229 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.229 10:21:11 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:02:50.229 10:21:11 -- setup/acl.sh@19 -- # continue 00:02:50.229 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.229 00:02:50.229 Type BDF Vendor Device NUMA Driver Device Block devices 00:02:50.229 10:21:11 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:02:50.229 10:21:11 -- setup/acl.sh@19 -- # continue 00:02:50.229 
10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.229 10:21:11 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:02:50.229 10:21:11 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:50.229 10:21:11 -- setup/acl.sh@20 -- # continue 00:02:50.229 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.229 10:21:11 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:02:50.229 10:21:11 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:50.229 10:21:11 -- setup/acl.sh@20 -- # continue 00:02:50.229 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.229 10:21:11 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:02:50.229 10:21:11 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:50.229 10:21:11 -- setup/acl.sh@20 -- # continue 00:02:50.229 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.229 10:21:11 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:02:50.229 10:21:11 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:50.229 10:21:11 -- setup/acl.sh@20 -- # continue 00:02:50.229 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.229 10:21:11 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # continue 00:02:50.230 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.230 10:21:11 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # continue 00:02:50.230 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.230 10:21:11 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # continue 00:02:50.230 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.230 10:21:11 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # continue 00:02:50.230 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.230 10:21:11 -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:02:50.230 10:21:11 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:02:50.230 10:21:11 -- setup/acl.sh@22 -- # devs+=("$dev") 00:02:50.230 10:21:11 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:02:50.230 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.230 10:21:11 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # continue 00:02:50.230 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.230 10:21:11 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # continue 00:02:50.230 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.230 10:21:11 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # continue 00:02:50.230 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.230 10:21:11 -- setup/acl.sh@19 -- # [[ 
0000:80:04.3 == *:*:*.* ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # continue 00:02:50.230 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.230 10:21:11 -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # continue 00:02:50.230 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.230 10:21:11 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # continue 00:02:50.230 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.230 10:21:11 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # continue 00:02:50.230 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.230 10:21:11 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # continue 00:02:50.230 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.230 10:21:11 -- setup/acl.sh@19 -- # [[ 0000:af:00.0 == *:*:*.* ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:02:50.230 10:21:11 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\a\f\:\0\0\.\0* ]] 00:02:50.230 10:21:11 -- setup/acl.sh@22 -- # devs+=("$dev") 00:02:50.230 10:21:11 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:02:50.230 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.230 10:21:11 -- setup/acl.sh@19 -- # [[ 0000:b0:00.0 == *:*:*.* ]] 00:02:50.230 10:21:11 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:02:50.230 10:21:11 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\b\0\:\0\0\.\0* ]] 00:02:50.230 10:21:11 -- setup/acl.sh@22 -- # devs+=("$dev") 00:02:50.230 10:21:11 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:02:50.230 10:21:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:50.230 10:21:11 -- setup/acl.sh@24 -- # (( 3 > 0 )) 00:02:50.230 10:21:11 -- setup/acl.sh@54 -- # run_test denied denied 00:02:50.230 10:21:11 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:50.230 10:21:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:50.230 10:21:11 -- common/autotest_common.sh@10 -- # set +x 00:02:50.230 ************************************ 00:02:50.230 START TEST denied 00:02:50.230 ************************************ 00:02:50.230 10:21:12 -- common/autotest_common.sh@1111 -- # denied 00:02:50.230 10:21:12 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0' 00:02:50.230 10:21:12 -- setup/acl.sh@38 -- # setup output config 00:02:50.230 10:21:12 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:02:50.230 10:21:12 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:50.230 10:21:12 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:02:54.432 0000:5e:00.0 (144d a80a): Skipping denied controller at 0000:5e:00.0 00:02:54.432 10:21:15 -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:02:54.432 10:21:15 -- setup/acl.sh@28 -- # local dev driver 00:02:54.432 10:21:15 -- setup/acl.sh@30 -- # for dev in "$@" 00:02:54.432 10:21:15 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:02:54.432 10:21:15 -- 
setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:02:54.432 10:21:15 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:02:54.432 10:21:15 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:02:54.432 10:21:15 -- setup/acl.sh@41 -- # setup reset 00:02:54.432 10:21:15 -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:54.432 10:21:15 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:59.745 00:02:59.745 real 0m8.927s 00:02:59.745 user 0m2.891s 00:02:59.745 sys 0m5.294s 00:02:59.745 10:21:20 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:02:59.745 10:21:20 -- common/autotest_common.sh@10 -- # set +x 00:02:59.745 ************************************ 00:02:59.745 END TEST denied 00:02:59.745 ************************************ 00:02:59.745 10:21:21 -- setup/acl.sh@55 -- # run_test allowed allowed 00:02:59.745 10:21:21 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:59.745 10:21:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:59.745 10:21:21 -- common/autotest_common.sh@10 -- # set +x 00:02:59.745 ************************************ 00:02:59.745 START TEST allowed 00:02:59.745 ************************************ 00:02:59.745 10:21:21 -- common/autotest_common.sh@1111 -- # allowed 00:02:59.745 10:21:21 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:02:59.745 10:21:21 -- setup/acl.sh@45 -- # setup output config 00:02:59.745 10:21:21 -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:02:59.745 10:21:21 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:59.745 10:21:21 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:05.029 0000:5e:00.0 (144d a80a): nvme -> vfio-pci 00:03:05.029 10:21:26 -- setup/acl.sh@47 -- # verify 0000:af:00.0 0000:b0:00.0 00:03:05.029 10:21:26 -- setup/acl.sh@28 -- # local dev driver 00:03:05.029 10:21:26 -- setup/acl.sh@30 -- # for dev in "$@" 00:03:05.029 10:21:26 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:af:00.0 ]] 00:03:05.029 10:21:26 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:af:00.0/driver 00:03:05.029 10:21:26 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:05.029 10:21:26 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:05.029 10:21:26 -- setup/acl.sh@30 -- # for dev in "$@" 00:03:05.029 10:21:26 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:b0:00.0 ]] 00:03:05.029 10:21:26 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:b0:00.0/driver 00:03:05.029 10:21:26 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:05.029 10:21:26 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:05.029 10:21:26 -- setup/acl.sh@48 -- # setup reset 00:03:05.029 10:21:26 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:05.029 10:21:26 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:09.243 00:03:09.243 real 0m9.611s 00:03:09.243 user 0m2.824s 00:03:09.243 sys 0m5.204s 00:03:09.243 10:21:30 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:03:09.243 10:21:30 -- common/autotest_common.sh@10 -- # set +x 00:03:09.243 ************************************ 00:03:09.243 END TEST allowed 00:03:09.243 ************************************ 00:03:09.243 00:03:09.243 real 0m27.008s 00:03:09.243 user 0m8.749s 00:03:09.243 sys 0m16.210s 00:03:09.243 10:21:30 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:03:09.243 10:21:30 -- 
common/autotest_common.sh@10 -- # set +x 00:03:09.243 ************************************ 00:03:09.243 END TEST acl 00:03:09.243 ************************************ 00:03:09.243 10:21:30 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:09.243 10:21:30 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:09.243 10:21:30 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:09.243 10:21:30 -- common/autotest_common.sh@10 -- # set +x 00:03:09.243 ************************************ 00:03:09.243 START TEST hugepages 00:03:09.243 ************************************ 00:03:09.243 10:21:30 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:09.243 * Looking for test storage... 00:03:09.243 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:09.243 10:21:31 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:09.243 10:21:31 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:09.243 10:21:31 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:09.243 10:21:31 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:09.243 10:21:31 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:09.243 10:21:31 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:09.243 10:21:31 -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:09.243 10:21:31 -- setup/common.sh@18 -- # local node= 00:03:09.243 10:21:31 -- setup/common.sh@19 -- # local var val 00:03:09.243 10:21:31 -- setup/common.sh@20 -- # local mem_f mem 00:03:09.243 10:21:31 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:09.243 10:21:31 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:09.243 10:21:31 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:09.243 10:21:31 -- setup/common.sh@28 -- # mapfile -t mem 00:03:09.243 10:21:31 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:09.243 10:21:31 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.243 10:21:31 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.243 10:21:31 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 43090704 kB' 'MemAvailable: 46664636 kB' 'Buffers: 9788 kB' 'Cached: 10840652 kB' 'SwapCached: 0 kB' 'Active: 8242352 kB' 'Inactive: 3417780 kB' 'Active(anon): 7676432 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 813176 kB' 'Mapped: 145044 kB' 'Shmem: 6866740 kB' 'KReclaimable: 174756 kB' 'Slab: 416864 kB' 'SReclaimable: 174756 kB' 'SUnreclaim: 242108 kB' 'KernelStack: 15952 kB' 'PageTables: 8664 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36439264 kB' 'Committed_AS: 9433632 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198480 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB' 00:03:09.243 10:21:31 -- setup/common.sh@32 -- # [[ MemTotal == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:09.243 10:21:31 -- setup/common.sh@32 -- # continue 00:03:09.243 [repetitive xtrace condensed: setup/common.sh@31-32 repeat the same IFS=': ' / read -r var val _ / [[ <field> == \H\u\g\e\p\a\g\e\s\i\z\e ]] / continue pattern for every remaining /proc/meminfo field from MemFree through HugePages_Rsvd, skipping each one] 00:03:09.245 10:21:31 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]]
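[annotation, not part of the captured log: the scan above is setup/common.sh's get_meminfo helper walking /proc/meminfo one "Field: value kB" record at a time until the requested field matches, at which point it echoes the value (here 2048) and returns. A minimal standalone sketch of the same idea, assuming the simple global case only — the per-node branch visible at setup/common.sh@23-25 is ignored, so this is a simplification rather than the script's actual body:

    get_meminfo() {                      # usage: get_meminfo Hugepagesize  -> prints 2048
        local get=$1 var val _
        while IFS=': ' read -r var val _; do   # "Hugepagesize:  2048 kB" -> var, val, unit
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < /proc/meminfo
        return 1                         # requested field not present
    }
]
00:03:09.245 10:21:31 -- 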
setup/common.sh@32 -- # continue 00:03:09.245 10:21:31 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.245 10:21:31 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.245 10:21:31 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:09.245 10:21:31 -- setup/common.sh@33 -- # echo 2048 00:03:09.245 10:21:31 -- setup/common.sh@33 -- # return 0 00:03:09.245 10:21:31 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:09.245 10:21:31 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:09.245 10:21:31 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:09.245 10:21:31 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:09.245 10:21:31 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:09.245 10:21:31 -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:09.245 10:21:31 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:09.245 10:21:31 -- setup/hugepages.sh@207 -- # get_nodes 00:03:09.245 10:21:31 -- setup/hugepages.sh@27 -- # local node 00:03:09.245 10:21:31 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:09.245 10:21:31 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:03:09.245 10:21:31 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:09.245 10:21:31 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:09.245 10:21:31 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:09.245 10:21:31 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:09.245 10:21:31 -- setup/hugepages.sh@208 -- # clear_hp 00:03:09.245 10:21:31 -- setup/hugepages.sh@37 -- # local node hp 00:03:09.245 10:21:31 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:09.245 10:21:31 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:09.245 10:21:31 -- setup/hugepages.sh@41 -- # echo 0 00:03:09.245 10:21:31 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:09.245 10:21:31 -- setup/hugepages.sh@41 -- # echo 0 00:03:09.245 10:21:31 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:09.245 10:21:31 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:09.245 10:21:31 -- setup/hugepages.sh@41 -- # echo 0 00:03:09.245 10:21:31 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:09.245 10:21:31 -- setup/hugepages.sh@41 -- # echo 0 00:03:09.245 10:21:31 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:09.245 10:21:31 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:09.245 10:21:31 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:09.245 10:21:31 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:09.245 10:21:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:09.245 10:21:31 -- common/autotest_common.sh@10 -- # set +x 00:03:09.245 ************************************ 00:03:09.245 START TEST default_setup 00:03:09.245 ************************************ 00:03:09.245 10:21:31 -- common/autotest_common.sh@1111 -- # default_setup 00:03:09.245 10:21:31 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:09.245 10:21:31 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:09.245 10:21:31 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:09.245 10:21:31 -- setup/hugepages.sh@51 -- # shift 00:03:09.245 10:21:31 -- 
setup/hugepages.sh@52 -- # node_ids=('0') 00:03:09.245 10:21:31 -- setup/hugepages.sh@52 -- # local node_ids 00:03:09.245 10:21:31 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:09.245 10:21:31 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:09.245 10:21:31 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:09.245 10:21:31 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:09.245 10:21:31 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:09.245 10:21:31 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:09.245 10:21:31 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:09.245 10:21:31 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:09.245 10:21:31 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:09.245 10:21:31 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:09.245 10:21:31 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:09.245 10:21:31 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:09.245 10:21:31 -- setup/hugepages.sh@73 -- # return 0 00:03:09.245 10:21:31 -- setup/hugepages.sh@137 -- # setup output 00:03:09.245 10:21:31 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:09.245 10:21:31 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:13.449 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:13.449 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:13.449 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:13.449 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:13.449 0000:af:00.0 (8086 2701): nvme -> vfio-pci 00:03:13.449 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:13.449 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:13.449 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:13.449 0000:5e:00.0 (144d a80a): nvme -> vfio-pci 00:03:13.449 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:13.449 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:13.449 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:13.449 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:13.449 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:13.449 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:13.449 0000:b0:00.0 (8086 4140): nvme -> vfio-pci 00:03:13.449 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:13.449 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:13.449 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:13.449 10:21:35 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:13.449 10:21:35 -- setup/hugepages.sh@89 -- # local node 00:03:13.449 10:21:35 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:13.449 10:21:35 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:13.449 10:21:35 -- setup/hugepages.sh@92 -- # local surp 00:03:13.449 10:21:35 -- setup/hugepages.sh@93 -- # local resv 00:03:13.449 10:21:35 -- setup/hugepages.sh@94 -- # local anon 00:03:13.449 10:21:35 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:13.449 10:21:35 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:13.449 10:21:35 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:13.449 10:21:35 -- setup/common.sh@18 -- # local node= 00:03:13.449 10:21:35 -- setup/common.sh@19 -- # local var val 00:03:13.449 10:21:35 -- setup/common.sh@20 -- # local mem_f mem 00:03:13.449 10:21:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:13.449 10:21:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:13.449 10:21:35 -- setup/common.sh@25 -- # [[ -n '' 
]] 00:03:13.449 10:21:35 -- setup/common.sh@28 -- # mapfile -t mem 00:03:13.449 10:21:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:13.449 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.449 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.449 10:21:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45317004 kB' 'MemAvailable: 48889748 kB' 'Buffers: 9788 kB' 'Cached: 10840736 kB' 'SwapCached: 0 kB' 'Active: 8257648 kB' 'Inactive: 3417780 kB' 'Active(anon): 7691728 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 828112 kB' 'Mapped: 145640 kB' 'Shmem: 6866824 kB' 'KReclaimable: 172380 kB' 'Slab: 411364 kB' 'SReclaimable: 172380 kB' 'SUnreclaim: 238984 kB' 'KernelStack: 15984 kB' 'PageTables: 8720 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9453588 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198576 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB' 00:03:13.449 10:21:35 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.449 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.449 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.449 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.449 10:21:35 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.449 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.449 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.449 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.449 10:21:35 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.449 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.449 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.449 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.449 10:21:35 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.449 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.449 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.449 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.449 10:21:35 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.449 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.449 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.449 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.449 10:21:35 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.449 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.449 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.449 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.449 10:21:35 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.449 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.449 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.449 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.449 
10:21:35 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.449 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.449 [repetitive xtrace condensed: the same setup/common.sh@31-32 scan runs over each remaining /proc/meminfo field from Active(anon) through Bounce, comparing it against AnonHugePages and hitting continue every time] 00:03:13.449 10:21:35 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.449 10:21:35 -- setup/common.sh@32 -- # continue
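[annotation, not part of the captured log: at this point verify_nr_hugepages has read AnonHugePages (0 kB, so anon=0) and goes on to read HugePages_Surp the same way. Earlier in this log, clear_hp (setup/hugepages.sh@39-41) reset every per-node pool by echoing 0 into each node's hugepage counters; a minimal standalone sketch of that reset, assuming the standard sysfs layout and root privileges — the real script iterates the nodes_sys array rather than globbing nodes directly:

    clear_hp() {
        local node hp
        for node in /sys/devices/system/node/node[0-9]*; do
            for hp in "$node"/hugepages/hugepages-*/nr_hugepages; do
                echo 0 > "$hp"           # free the pool for this node and page size
            done
        done
    }
]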
00:03:13.450 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.450 10:21:35 -- setup/common.sh@33 -- # echo 0 00:03:13.450 10:21:35 -- setup/common.sh@33 -- # return 0 00:03:13.450 10:21:35 -- setup/hugepages.sh@97 -- # anon=0 00:03:13.450 10:21:35 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:13.450 10:21:35 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:13.450 10:21:35 -- setup/common.sh@18 -- # local node= 00:03:13.450 10:21:35 -- setup/common.sh@19 -- # local var val 00:03:13.450 10:21:35 -- setup/common.sh@20 -- # local mem_f mem 00:03:13.450 10:21:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:13.450 10:21:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:13.450 10:21:35 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:13.450 10:21:35 -- setup/common.sh@28 -- # mapfile -t mem 00:03:13.450 10:21:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.450 10:21:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45318712 kB' 'MemAvailable: 48891456 kB' 'Buffers: 9788 kB' 'Cached: 10840740 kB' 'SwapCached: 0 kB' 'Active: 8257604 kB' 'Inactive: 3417780 kB' 'Active(anon): 7691684 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 
'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 828124 kB' 'Mapped: 145640 kB' 'Shmem: 6866828 kB' 'KReclaimable: 172380 kB' 'Slab: 411336 kB' 'SReclaimable: 172380 kB' 'SUnreclaim: 238956 kB' 'KernelStack: 15952 kB' 'PageTables: 8604 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9453600 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198544 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB' 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.450 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.450 10:21:35 -- setup/common.sh@32 -- # [[ 
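The trace above is the body of get_meminfo from setup/common.sh. Reconstructed from the xtrace alone (the real script may differ in detail), the helper amounts to the following sketch: it snapshots a meminfo file, strips any per-node prefix, then scans key by key until the requested field matches and echoes its value:

    # minimal sketch of get_meminfo, reconstructed from the xtrace above;
    # not the literal SPDK source
    shopt -s extglob   # needed for the +([0-9]) pattern seen in the trace
    get_meminfo() {
        local get=$1 node=$2 var val line
        local mem_f=/proc/meminfo mem
        # with a node argument, read that node's meminfo instead (common.sh@23-24)
        [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # drop the "Node N " prefix (common.sh@29)
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue   # the long continue runs in this log
            echo "$val"
            return 0
        done
    }

Each scalar the test needs triggers a fresh call, which is why the same scan repeats several times below.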
00:03:13.451 10:21:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:13.451 10:21:35 -- setup/common.sh@33 -- # echo 0
00:03:13.451 10:21:35 -- setup/common.sh@33 -- # return 0
00:03:13.451 10:21:35 -- setup/hugepages.sh@99 -- # surp=0
00:03:13.451 10:21:35 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
[xtrace condensed: get_meminfo prologue (common.sh@17-31) identical to the call above — node empty, mem_f=/proc/meminfo]
00:03:13.452 10:21:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45323084 kB' 'MemAvailable: 48895828 kB' 'Buffers: 9788 kB' 'Cached: 10840752 kB' 'SwapCached: 0 kB' 'Active: 8257068 kB' 'Inactive: 3417780 kB' 'Active(anon): 7691148 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 827592 kB' 'Mapped: 145620 kB' 'Shmem: 6866840 kB' 'KReclaimable: 172380 kB' 'Slab: 411304 kB' 'SReclaimable: 172380 kB' 'SUnreclaim: 238924 kB' 'KernelStack: 15968 kB' 'PageTables: 8648 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9453616 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198544 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB'
[xtrace condensed: key-by-key scan over this snapshot until HugePages_Rsvd matched]
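The IFS=': ' read -r var val _ step handles both unit-suffixed and bare-count fields, which is why the same parser serves MemTotal and the HugePages_* counters alike. For example, with values taken from the snapshot above:

    IFS=': ' read -r var val _ <<< 'MemTotal: 60295624 kB'   # var=MemTotal  val=60295624  _=kB
    IFS=': ' read -r var val _ <<< 'HugePages_Rsvd: 0'       # var=HugePages_Rsvd  val=0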
00:03:13.453 10:21:35 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:13.453 10:21:35 -- setup/common.sh@33 -- # echo 0
00:03:13.453 10:21:35 -- setup/common.sh@33 -- # return 0
00:03:13.453 10:21:35 -- setup/hugepages.sh@100 -- # resv=0
00:03:13.453 10:21:35 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
nr_hugepages=1024
00:03:13.453 10:21:35 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:03:13.453 10:21:35 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:03:13.453 10:21:35 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
anon_hugepages=0
00:03:13.453 10:21:35 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:13.453 10:21:35 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:13.453 10:21:35 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
[xtrace condensed: get_meminfo prologue (common.sh@17-31) identical to the calls above, reading /proc/meminfo]
00:03:13.453 10:21:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45323084 kB' 'MemAvailable: 48895828 kB' 'Buffers: 9788 kB' 'Cached: 10840776 kB' 'SwapCached: 0 kB' 'Active: 8257080 kB' 'Inactive: 3417780 kB' 'Active(anon): 7691160 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 827548 kB' 'Mapped: 145620 kB' 'Shmem: 6866864 kB' 'KReclaimable: 172380 kB' 'Slab: 411304 kB' 'SReclaimable: 172380 kB' 'SUnreclaim: 238924 kB' 'KernelStack: 15968 kB' 'PageTables: 8644 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9453632 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198544 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB'
[xtrace condensed: key-by-key scan over this snapshot until HugePages_Total matched; the match and the final checks follow]
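Taken together, hugepages.sh@97-109 reduce to three meminfo reads and a consistency check on the configured page count. A sketch of that logic as it appears in this trace (variable names from the xtrace; the literal script may differ):

    anon=$(get_meminfo AnonHugePages)    # 0 in this run
    surp=$(get_meminfo HugePages_Surp)   # 0
    resv=$(get_meminfo HugePages_Rsvd)   # 0
    # all 1024 configured hugepages must be plain, pre-allocated pages:
    (( 1024 == nr_hugepages + surp + resv ))   # hugepages.sh@107
    (( 1024 == nr_hugepages ))                 # hugepages.sh@109

The HugePages_Total read traced above apparently feeds the same identity again at hugepages.sh@110, this time against the kernel's own total.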
00:03:13.454 10:21:35 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:13.454 10:21:35 -- setup/common.sh@33 -- # echo 1024
00:03:13.454 10:21:35 -- setup/common.sh@33 -- # return 0
00:03:13.454 10:21:35 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:13.454 10:21:35 -- setup/hugepages.sh@112 -- # get_nodes
00:03:13.454 10:21:35 -- setup/hugepages.sh@27 -- # local node
00:03:13.454 10:21:35 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:13.454 10:21:35 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:13.454 10:21:35 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:13.454 10:21:35 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:13.454 10:21:35 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:13.455 10:21:35 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:13.455 10:21:35 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:13.455 10:21:35 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
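get_nodes, traced just above, discovers the NUMA topology by globbing sysfs and records the per-node hugepage counts (1024 on node0, 0 on node1 in this run). A sketch consistent with the xtrace; the exact file the counts are read from is an assumption here, using the kernel's standard per-node counter path:

    shopt -s extglob
    get_nodes() {
        local node
        for node in /sys/devices/system/node/node+([0-9]); do
            # assumed source of the per-node count seen at hugepages.sh@30
            nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
        done
        no_nodes=${#nodes_sys[@]}   # 2 on this machine
        (( no_nodes > 0 ))          # the test needs at least one node
    }

With the topology known, the loop at hugepages.sh@115-117 re-queries get_meminfo per node, which exercises the per-node branch of the prologue next.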
00:03:13.455 10:21:35 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:13.455 10:21:35 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:13.455 10:21:35 -- setup/common.sh@18 -- # local node=0
00:03:13.455 10:21:35 -- setup/common.sh@19 -- # local var val
00:03:13.455 10:21:35 -- setup/common.sh@20 -- # local mem_f mem
00:03:13.455 10:21:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:13.455 10:21:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:13.455 10:21:35 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:13.455 10:21:35 -- setup/common.sh@28 -- # mapfile -t mem
00:03:13.455 10:21:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:13.455 10:21:35 -- setup/common.sh@31 -- # IFS=': '
00:03:13.455 10:21:35 -- setup/common.sh@31 -- # read -r var val _
00:03:13.455 10:21:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634128 kB' 'MemFree: 20872700 kB' 'MemUsed: 11761428 kB' 'SwapCached: 0 kB' 'Active: 5714380 kB' 'Inactive: 3357500 kB' 'Active(anon): 5310376 kB' 'Inactive(anon): 0 kB' 'Active(file): 404004 kB' 'Inactive(file): 3357500 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8597956 kB' 'Mapped: 50104 kB' 'AnonPages: 477072 kB' 'Shmem: 4836452 kB' 'KernelStack: 9400 kB' 'PageTables: 5264 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 99932 kB' 'Slab: 230560 kB' 'SReclaimable: 99932 kB' 'SUnreclaim: 130628 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[xtrace condensed: key-by-key scan over the node0 snapshot until HugePages_Surp matched]
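This node-0 call exercises the other branch of the prologue: common.sh@24 swaps mem_f to the per-node file, whose lines carry a "Node N" prefix on typical Linux kernels, e.g. (values from the snapshot above):

    $ head -3 /sys/devices/system/node/node0/meminfo
    Node 0 MemTotal:       32634128 kB
    Node 0 MemFree:        20872700 kB
    Node 0 MemUsed:        11761428 kB

That prefix is exactly what the mem=("${mem[@]#Node +([0-9]) }") strip at common.sh@29 removes, so the same key:value scan can run unchanged against either file.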
setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.455 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.455 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.455 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.455 10:21:35 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.455 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.455 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.455 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.455 10:21:35 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.455 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.456 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.456 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.456 10:21:35 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.456 10:21:35 -- setup/common.sh@32 -- # continue 00:03:13.456 10:21:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.456 10:21:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.456 10:21:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.456 10:21:35 -- setup/common.sh@33 -- # echo 0 00:03:13.456 10:21:35 -- setup/common.sh@33 -- # return 0 00:03:13.456 10:21:35 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:13.456 10:21:35 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:13.456 10:21:35 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:13.456 10:21:35 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:13.456 10:21:35 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:13.456 node0=1024 expecting 1024 00:03:13.456 10:21:35 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:13.456 00:03:13.456 real 0m3.992s 00:03:13.456 user 0m1.491s 00:03:13.456 sys 0m2.573s 00:03:13.456 10:21:35 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:03:13.456 10:21:35 -- common/autotest_common.sh@10 -- # set +x 00:03:13.456 ************************************ 00:03:13.456 END TEST default_setup 00:03:13.456 ************************************ 00:03:13.456 10:21:35 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:03:13.456 10:21:35 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:13.456 10:21:35 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:13.456 10:21:35 -- common/autotest_common.sh@10 -- # set +x 00:03:13.456 ************************************ 00:03:13.456 START TEST per_node_1G_alloc 00:03:13.456 ************************************ 00:03:13.456 10:21:35 -- common/autotest_common.sh@1111 -- # per_node_1G_alloc 00:03:13.456 10:21:35 -- setup/hugepages.sh@143 -- # local IFS=, 00:03:13.456 10:21:35 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:03:13.456 10:21:35 -- setup/hugepages.sh@49 -- # local size=1048576 00:03:13.456 10:21:35 -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:03:13.456 10:21:35 -- setup/hugepages.sh@51 -- # shift 00:03:13.456 10:21:35 -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:03:13.456 10:21:35 -- setup/hugepages.sh@52 -- # local node_ids 00:03:13.456 10:21:35 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:13.456 10:21:35 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:13.456 10:21:35 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:03:13.456 10:21:35 -- setup/hugepages.sh@62 -- # 
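The default_setup pass above ends by reading HugePages_Surp for node 0 out of /sys/devices/system/node/node0/meminfo and confirming node0=1024 expecting 1024. For reference, the get_meminfo dance in the trace reduces to the following parsing loop; this is a minimal sketch, assuming the per-node meminfo format shown in the snapshot ("Node 0 <Field>: <value>"), and get_field is an illustrative name, not a helper from the SPDK tree:

  #!/usr/bin/env bash
  # Minimal sketch of what setup/common.sh's get_meminfo does in the trace above.
  shopt -s extglob   # needed for the +([0-9]) pattern used to strip "Node N "
  get_field() {      # usage: get_field <field> <node>; empty node -> /proc/meminfo
    local get=$1 node=$2 var val _
    local mem_f=/proc/meminfo
    [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
      mem_f=/sys/devices/system/node/node$node/meminfo
    local -a mem
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")          # per-node files prefix every line with "Node N "
    local line
    for line in "${mem[@]}"; do
      IFS=': ' read -r var val _ <<< "$line"  # splits "HugePages_Surp: 0" into var/val
      [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
  }
  get_field HugePages_Surp 0   # prints 0 on the run above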
00:03:13.456 10:21:35 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:03:13.456 10:21:35 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:13.456 10:21:35 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:13.456 10:21:35 -- common/autotest_common.sh@10 -- # set +x
00:03:13.456 ************************************
00:03:13.456 START TEST per_node_1G_alloc
00:03:13.456 ************************************
00:03:13.456 10:21:35 -- common/autotest_common.sh@1111 -- # per_node_1G_alloc
00:03:13.456 10:21:35 -- setup/hugepages.sh@143 -- # local IFS=,
00:03:13.456 10:21:35 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:03:13.456 10:21:35 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:13.456 10:21:35 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:03:13.456 10:21:35 -- setup/hugepages.sh@51 -- # shift
00:03:13.456 10:21:35 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:03:13.456 10:21:35 -- setup/hugepages.sh@52 -- # local node_ids
00:03:13.456 10:21:35 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:13.456 10:21:35 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:13.456 10:21:35 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:03:13.456 10:21:35 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:03:13.456 10:21:35 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:13.456 10:21:35 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:13.456 10:21:35 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:13.456 10:21:35 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:13.456 10:21:35 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:13.456 10:21:35 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:03:13.456 10:21:35 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:13.456 10:21:35 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:13.456 10:21:35 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:13.456 10:21:35 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:13.456 10:21:35 -- setup/hugepages.sh@73 -- # return 0
00:03:13.456 10:21:35 -- setup/hugepages.sh@146 -- # NRHUGE=512
00:03:13.456 10:21:35 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:03:13.456 10:21:35 -- setup/hugepages.sh@146 -- # setup output
00:03:13.456 10:21:35 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:13.456 10:21:35 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:16.750 0000:5e:00.0 (144d a80a): Already using the vfio-pci driver
00:03:16.750 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:16.750 0000:af:00.0 (8086 2701): Already using the vfio-pci driver
00:03:16.750 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:16.750 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:16.750 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:16.750 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:16.750 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:16.750 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:16.750 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:16.750 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:16.750 0000:b0:00.0 (8086 4140): Already using the vfio-pci driver
00:03:16.750 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:16.750 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:16.750 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:16.750 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:16.750 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:16.750 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:16.750 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
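NRHUGE=512 with HUGENODE=0,1 asks scripts/setup.sh for 512 hugepages of the default 2048 kB size on each of nodes 0 and 1, i.e. 1 GiB per node (512 × 2048 kB = 1048576 kB, the size get_test_nr_hugepages was called with); the system-wide nr_hugepages=1024 read back below is the two nodes combined. A minimal sketch of such a per-node reservation using the kernel's standard sysfs knobs (how setup.sh does this internally may differ):

  # Reserve NRHUGE default-size hugepages on every node listed in HUGENODE.
  # Writing these files needs root; the sysfs layout is the stock kernel one.
  NRHUGE=512 HUGENODE=0,1
  hp_kb=$(awk '/^Hugepagesize:/ {print $2}' /proc/meminfo)   # 2048 on this machine
  IFS=',' read -ra nodes <<< "$HUGENODE"
  for n in "${nodes[@]}"; do
    echo "$NRHUGE" > "/sys/devices/system/node/node$n/hugepages/hugepages-${hp_kb}kB/nr_hugepages"
  done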
00:03:17.014 10:21:38 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:03:17.014 10:21:38 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:03:17.014 10:21:38 -- setup/hugepages.sh@89 -- # local node
00:03:17.014 10:21:38 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:17.014 10:21:38 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:17.014 10:21:38 -- setup/hugepages.sh@92 -- # local surp
00:03:17.014 10:21:38 -- setup/hugepages.sh@93 -- # local resv
00:03:17.014 10:21:38 -- setup/hugepages.sh@94 -- # local anon
00:03:17.014 10:21:38 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:17.014 10:21:38 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:17.014 10:21:38 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:17.014 10:21:38 -- setup/common.sh@18 -- # local node=
00:03:17.014 10:21:38 -- setup/common.sh@19 -- # local var val
00:03:17.014 10:21:38 -- setup/common.sh@20 -- # local mem_f mem
00:03:17.014 10:21:38 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:17.014 10:21:38 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:17.014 10:21:38 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:17.014 10:21:38 -- setup/common.sh@28 -- # mapfile -t mem
00:03:17.014 10:21:38 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:17.014 10:21:38 -- setup/common.sh@31 -- # IFS=': '
00:03:17.014 10:21:38 -- setup/common.sh@31 -- # read -r var val _
00:03:17.014 10:21:38 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45330360 kB' 'MemAvailable: 48903120 kB' 'Buffers: 9788 kB' 'Cached: 10840848 kB' 'SwapCached: 0 kB' 'Active: 8254868 kB' 'Inactive: 3417780 kB' 'Active(anon): 7688948 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 824752 kB' 'Mapped: 144180 kB' 'Shmem: 6866936 kB' 'KReclaimable: 172412 kB' 'Slab: 411172 kB' 'SReclaimable: 172412 kB' 'SUnreclaim: 238760 kB' 'KernelStack: 16032 kB' 'PageTables: 7844 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9442920 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198672 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB'
[per-field xtrace elided: setup/common.sh@31-32 re-reads and skips each field above, MemTotal through HardwareCorrupted, until AnonHugePages is reached]
00:03:17.015 10:21:38 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:17.015 10:21:38 -- setup/common.sh@33 -- # echo 0
00:03:17.015 10:21:38 -- setup/common.sh@33 -- # return 0
00:03:17.015 10:21:38 -- setup/hugepages.sh@97 -- # anon=0
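Both values verify_nr_hugepages has collected so far came back 0: the @96 guard confirmed transparent hugepages are not disabled (the bracketed word in "always [madvise] never" is the active mode), and AnonHugePages then read as 0 kB. A sketch of that guard, reusing the hypothetical get_field from the earlier sketch:

  # Only account AnonHugePages when THP is not set to [never].
  anon=0
  thp=/sys/kernel/mm/transparent_hugepage/enabled   # e.g. "always [madvise] never"
  if [[ -r $thp && $(<"$thp") != *"[never]"* ]]; then
    anon=$(get_field AnonHugePages "")   # empty node -> system-wide /proc/meminfo
  fi
  echo "anon_hugepages=$anon"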
00:03:17.015 10:21:38 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:17.015 10:21:38 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:17.015 10:21:38 -- setup/common.sh@18 -- # local node=
00:03:17.015 10:21:38 -- setup/common.sh@19 -- # local var val
00:03:17.015 10:21:38 -- setup/common.sh@20 -- # local mem_f mem
00:03:17.015 10:21:38 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:17.015 10:21:38 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:17.015 10:21:38 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:17.015 10:21:38 -- setup/common.sh@28 -- # mapfile -t mem
00:03:17.015 10:21:38 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:17.015 10:21:38 -- setup/common.sh@31 -- # IFS=': '
00:03:17.015 10:21:38 -- setup/common.sh@31 -- # read -r var val _
00:03:17.015 10:21:38 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45329260 kB' 'MemAvailable: 48902020 kB' 'Buffers: 9788 kB' 'Cached: 10840848 kB' 'SwapCached: 0 kB' 'Active: 8255092 kB' 'Inactive: 3417780 kB' 'Active(anon): 7689172 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 824996 kB' 'Mapped: 144176 kB' 'Shmem: 6866936 kB' 'KReclaimable: 172412 kB' 'Slab: 411140 kB' 'SReclaimable: 172412 kB' 'SUnreclaim: 238728 kB' 'KernelStack: 16032 kB' 'PageTables: 8144 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9442932 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198704 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB'
[per-field xtrace elided: setup/common.sh@31-32 re-reads and skips each field above, MemTotal through HugePages_Rsvd, until HugePages_Surp is reached]
00:03:17.016 10:21:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:17.016 10:21:39 -- setup/common.sh@33 -- # echo 0
00:03:17.016 10:21:39 -- setup/common.sh@33 -- # return 0
00:03:17.016 10:21:39 -- setup/hugepages.sh@99 -- # surp=0
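surp comes back 0 the same way anon did. As an aside, the four counters that verify_nr_hugepages gathers one get_meminfo call at a time can be spot-checked in a single pass; an equivalent (but not the script's own) one-liner:

  awk '/^HugePages_(Total|Free|Rsvd|Surp):/ {print $1, $2}' /proc/meminfo
  # On this run:
  #   HugePages_Total: 1024
  #   HugePages_Free: 1024
  #   HugePages_Rsvd: 0
  #   HugePages_Surp: 0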
00:03:17.016 10:21:39 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:17.016 10:21:39 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:17.016 10:21:39 -- setup/common.sh@18 -- # local node=
00:03:17.016 10:21:39 -- setup/common.sh@19 -- # local var val
00:03:17.016 10:21:39 -- setup/common.sh@20 -- # local mem_f mem
00:03:17.016 10:21:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:17.016 10:21:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:17.016 10:21:39 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:17.016 10:21:39 -- setup/common.sh@28 -- # mapfile -t mem
00:03:17.016 10:21:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:17.017 10:21:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45329324 kB' 'MemAvailable: 48902084 kB' 'Buffers: 9788 kB' 'Cached: 10840860 kB' 'SwapCached: 0 kB' 'Active: 8254428 kB' 'Inactive: 3417780 kB' 'Active(anon): 7688508 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 824728 kB' 'Mapped: 144100 kB' 'Shmem: 6866948 kB' 'KReclaimable: 172412 kB' 'Slab: 411184 kB' 'SReclaimable: 172412 kB' 'SUnreclaim: 238772 kB' 'KernelStack: 15936 kB' 'PageTables: 8076 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9441552 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198640 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB'
00:03:17.017 10:21:39 -- setup/common.sh@31 -- # IFS=': '
00:03:17.017 10:21:39 -- setup/common.sh@31 -- # read -r var val _
[per-field xtrace elided: setup/common.sh@31-32 re-reads and skips each field above, MemTotal through HugePages_Free, until HugePages_Rsvd is reached]
00:03:17.018 10:21:39 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:17.018 10:21:39 -- setup/common.sh@33 -- # echo 0
00:03:17.018 10:21:39 -- setup/common.sh@33 -- # return 0
00:03:17.018 10:21:39 -- setup/hugepages.sh@100 -- # resv=0
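With resv also read back as 0, the trace below closes the bookkeeping: it prints the four values, requires that the configured count equal nr_hugepages + surp + resv, and then fetches HugePages_Total for the final comparison. The whole check is just this arithmetic (a sketch with this run's values, again leaning on the hypothetical get_field helper from above):

  # Sketch of the verify_nr_hugepages consistency checks (hugepages.sh@107/@109).
  nr_hugepages=1024 surp=0 resv=0
  (( 1024 == nr_hugepages + surp + resv )) || echo "surplus/reserved pages out of line"
  total=$(get_field HugePages_Total "")    # 1024 in the snapshot below
  (( total == nr_hugepages )) || echo "HugePages_Total != requested"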
setup/common.sh@31 -- # IFS=': ' 00:03:17.018 10:21:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.018 10:21:39 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.018 10:21:39 -- setup/common.sh@33 -- # echo 0 00:03:17.018 10:21:39 -- setup/common.sh@33 -- # return 0 00:03:17.018 10:21:39 -- setup/hugepages.sh@100 -- # resv=0 00:03:17.018 10:21:39 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:17.018 nr_hugepages=1024 00:03:17.018 10:21:39 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:17.018 resv_hugepages=0 00:03:17.018 10:21:39 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:17.018 surplus_hugepages=0 00:03:17.018 10:21:39 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:17.018 anon_hugepages=0 00:03:17.018 10:21:39 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:17.018 10:21:39 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:17.018 10:21:39 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:17.018 10:21:39 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:17.018 10:21:39 -- setup/common.sh@18 -- # local node= 00:03:17.018 10:21:39 -- setup/common.sh@19 -- # local var val 00:03:17.018 10:21:39 -- setup/common.sh@20 -- # local mem_f mem 00:03:17.018 10:21:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:17.018 10:21:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:17.018 10:21:39 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:17.018 10:21:39 -- setup/common.sh@28 -- # mapfile -t mem 00:03:17.018 10:21:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:17.018 10:21:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.018 10:21:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.018 10:21:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45328820 kB' 'MemAvailable: 48901580 kB' 'Buffers: 9788 kB' 'Cached: 10840888 kB' 'SwapCached: 0 kB' 'Active: 8254624 kB' 'Inactive: 3417780 kB' 'Active(anon): 7688704 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 824884 kB' 'Mapped: 144100 kB' 'Shmem: 6866976 kB' 'KReclaimable: 172412 kB' 'Slab: 411184 kB' 'SReclaimable: 172412 kB' 'SUnreclaim: 238772 kB' 'KernelStack: 15968 kB' 'PageTables: 8456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9442960 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198752 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB' 00:03:17.018 10:21:39 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.018 10:21:39 -- setup/common.sh@32 -- # continue 00:03:17.018 10:21:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.018 10:21:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.018 10:21:39 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.018 10:21:39 -- 
[... setup/common.sh@31/@32 compare-and-continue scan over each snapshot key (MemTotal onward) until HugePages_Total matches elided ...]
00:03:17.019 10:21:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:17.019 10:21:39 -- setup/common.sh@33 -- # echo 1024
00:03:17.019 10:21:39 -- setup/common.sh@33 -- # return 0
00:03:17.019 10:21:39 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:17.019 10:21:39 -- setup/hugepages.sh@112 -- # get_nodes
00:03:17.019 10:21:39 -- setup/hugepages.sh@27 -- # local node
00:03:17.019 10:21:39 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:17.019 10:21:39 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:17.019 10:21:39 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:17.019 10:21:39 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:17.019 10:21:39 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:17.019 10:21:39 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
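Every one of the scans collapsed above follows the same get_meminfo pattern: mapfile the relevant meminfo file, strip any per-node "Node N " prefix, then do a linear IFS=': ' read over the lines until the requested key matches. A minimal self-contained sketch of that technique (function name and usage are illustrative, not the repo's exact code):

    #!/usr/bin/env bash
    # Sketch of the get_meminfo pattern traced in setup/common.sh (illustrative).
    shopt -s extglob    # needed for the +([0-9]) pattern below

    get_meminfo_sketch() {
        local get=$1 node=$2
        local mem_f=/proc/meminfo mem var val _ line
        # Per-node counters live in sysfs and carry a "Node N " prefix.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")    # strip the "Node N " prefix, if any
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            # Linear scan: first key match wins; value is echoed (kB or pages).
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }

    get_meminfo_sketch HugePages_Total       # system-wide, e.g. 1024
    get_meminfo_sketch HugePages_Surp 0      # node 0 only, e.g. 0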
00:03:17.019 10:21:39 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:17.019 10:21:39 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:17.019 10:21:39 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:17.019 10:21:39 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:17.019 10:21:39 -- setup/common.sh@18 -- # local node=0
00:03:17.020 10:21:39 -- setup/common.sh@19 -- # local var val
00:03:17.020 10:21:39 -- setup/common.sh@20 -- # local mem_f mem
00:03:17.020 10:21:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:17.020 10:21:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:17.020 10:21:39 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:17.020 10:21:39 -- setup/common.sh@28 -- # mapfile -t mem
00:03:17.020 10:21:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:17.020 10:21:39 -- setup/common.sh@31 -- # IFS=': '
00:03:17.020 10:21:39 -- setup/common.sh@31 -- # read -r var val _
00:03:17.020 10:21:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634128 kB' 'MemFree: 21920704 kB' 'MemUsed: 10713424 kB' 'SwapCached: 0 kB' 'Active: 5713628 kB' 'Inactive: 3357500 kB' 'Active(anon): 5309624 kB' 'Inactive(anon): 0 kB' 'Active(file): 404004 kB' 'Inactive(file): 3357500 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8598056 kB' 'Mapped: 48588 kB' 'AnonPages: 476260 kB' 'Shmem: 4836552 kB' 'KernelStack: 9320 kB' 'PageTables: 4888 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 99964 kB' 'Slab: 230500 kB' 'SReclaimable: 99964 kB' 'SUnreclaim: 130536 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... setup/common.sh@31/@32 compare-and-continue scan over the node0 counters until HugePages_Surp matches elided ...]
00:03:17.281 10:21:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:17.281 10:21:39 -- setup/common.sh@33 -- # echo 0
00:03:17.281 10:21:39 -- setup/common.sh@33 -- # return 0
00:03:17.281 10:21:39 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:17.281 10:21:39 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:17.281 10:21:39 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:17.281 10:21:39 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:17.281 10:21:39 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:17.281 10:21:39 -- setup/common.sh@18 -- # local node=1
00:03:17.281 10:21:39 -- setup/common.sh@19 -- # local var val
00:03:17.281 10:21:39 -- setup/common.sh@20 -- # local mem_f mem
00:03:17.281 10:21:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:17.281 10:21:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:17.281 10:21:39 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:17.281 10:21:39 -- setup/common.sh@28 -- # mapfile -t mem
00:03:17.281 10:21:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:17.281 10:21:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27661496 kB' 'MemFree: 23411808 kB' 'MemUsed: 4249688 kB' 'SwapCached: 0 kB' 'Active: 2541876 kB' 'Inactive: 60280 kB' 'Active(anon): 2379960 kB' 'Inactive(anon): 0 kB' 'Active(file): 161916 kB' 'Inactive(file): 60280 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2252624 kB' 'Mapped: 95512 kB' 'AnonPages: 349664 kB' 'Shmem: 2030428 kB' 'KernelStack: 6776 kB' 'PageTables: 3884 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 72448 kB' 'Slab: 180920 kB' 'SReclaimable: 72448 kB' 'SUnreclaim: 108472 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... setup/common.sh@31/@32 compare-and-continue scan over the node1 counters until HugePages_Surp matches elided ...]
00:03:17.282 10:21:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:17.282 10:21:39 -- setup/common.sh@33 -- # echo 0
00:03:17.282 10:21:39 -- setup/common.sh@33 -- # return 0
00:03:17.282 10:21:39 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:17.282 10:21:39 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:17.282 10:21:39 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:17.282 10:21:39 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:17.282 10:21:39 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:03:17.282 node0=512 expecting 512
00:03:17.282 10:21:39 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:17.282 10:21:39 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:17.282 10:21:39 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:17.282 10:21:39 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:03:17.282 node1=512 expecting 512
00:03:17.282 10:21:39 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:03:17.282 real 0m3.741s
00:03:17.282 user 0m1.425s
00:03:17.282 sys 0m2.374s
00:03:17.282 10:21:39 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:03:17.282 10:21:39 -- common/autotest_common.sh@10 -- # set +x
00:03:17.282 ************************************
00:03:17.282 END TEST per_node_1G_alloc
00:03:17.282 ************************************
00:03:17.282 10:21:39 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:03:17.282 10:21:39 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:17.282 10:21:39 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:17.282 10:21:39 -- common/autotest_common.sh@10 -- # set +x
00:03:17.282 ************************************
00:03:17.282 START TEST even_2G_alloc
00:03:17.282 ************************************
00:03:17.282 10:21:39 -- common/autotest_common.sh@1111 -- # even_2G_alloc
00:03:17.282 10:21:39 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:03:17.282 10:21:39 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:17.282 10:21:39 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:17.282 10:21:39 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:17.282 10:21:39 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:17.282 10:21:39 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:17.282 10:21:39 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:17.282 10:21:39 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:17.282 10:21:39 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:17.282 10:21:39 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:17.282 10:21:39 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:17.282 10:21:39 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:17.282 10:21:39 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:17.282 10:21:39 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:17.282 10:21:39 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:17.282 10:21:39 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:17.282 10:21:39 -- setup/hugepages.sh@83 -- # : 512
00:03:17.282 10:21:39 -- setup/hugepages.sh@84 -- # : 1
00:03:17.282 10:21:39 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:17.282 10:21:39 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:17.282 10:21:39 -- setup/hugepages.sh@83 -- # : 0
00:03:17.282 10:21:39 -- setup/hugepages.sh@84 -- # : 0
00:03:17.282 10:21:39 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
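At this point get_test_nr_hugepages_per_node has split nr_hugepages=1024 into nodes_test=(512 512) for the two NUMA nodes. A standalone sketch of an equivalent even split (illustrative only; the script itself fills nodes_test from the highest node downward with the `:`-marker arithmetic traced above):

    #!/usr/bin/env bash
    # Sketch: divide a hugepage count evenly across NUMA nodes, handing any
    # remainder to the lower-numbered nodes (mirrors the 512+512 split here).
    split_hugepages_sketch() {
        local nr=$1 no_nodes=$2 i
        local -a nodes_test
        for ((i = 0; i < no_nodes; i++)); do
            # integer share, plus one extra page while the remainder lasts
            nodes_test[i]=$((nr / no_nodes + (i < nr % no_nodes ? 1 : 0)))
        done
        echo "${nodes_test[@]}"
    }

    split_hugepages_sketch 1024 2    # -> 512 512
    split_hugepages_sketch 1025 2    # -> 513 512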
00:03:17.282 10:21:39 -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:03:17.282 10:21:39 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:03:17.282 10:21:39 -- setup/hugepages.sh@153 -- # setup output
00:03:17.282 10:21:39 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:17.282 10:21:39 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:20.580 0000:5e:00.0 (144d a80a): Already using the vfio-pci driver
00:03:20.580 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:20.580 0000:af:00.0 (8086 2701): Already using the vfio-pci driver
00:03:20.841 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:20.841 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:20.841 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:20.841 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:20.841 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:20.841 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:20.841 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:20.841 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:20.841 0000:b0:00.0 (8086 4140): Already using the vfio-pci driver
00:03:20.841 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:20.841 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:20.841 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:20.841 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:20.841 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:20.841 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:20.841 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
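The per-device "Already using the vfio-pci driver" lines above are scripts/setup.sh walking the PCI devices it manages; here nothing needs rebinding. A sketch of the standard sysfs driver_override mechanism such a rebind relies on (the BDF and the flow are illustrative, not the script's exact logic; requires root):

    #!/usr/bin/env bash
    # Sketch: bind one PCI device to vfio-pci via the kernel's driver_override
    # interface. bdf is an example address; the real script discovers devices.
    bdf=0000:5e:00.0
    dev=/sys/bus/pci/devices/$bdf
    if [[ $(readlink -f "$dev/driver") == */vfio-pci ]]; then
        echo "$bdf already bound to vfio-pci"     # matches the log lines above
    else
        echo vfio-pci > "$dev/driver_override"    # pin the next probe to vfio-pci
        [[ -e $dev/driver ]] && echo "$bdf" > "$dev/driver/unbind"
        echo "$bdf" > /sys/bus/pci/drivers_probe  # re-probe; vfio-pci claims it
        echo > "$dev/driver_override"             # clear the override again
    fi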
00:03:20.841 10:21:42 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:03:20.841 10:21:42 -- setup/hugepages.sh@89 -- # local node
00:03:20.841 10:21:42 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:20.841 10:21:42 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:20.841 10:21:42 -- setup/hugepages.sh@92 -- # local surp
00:03:20.841 10:21:42 -- setup/hugepages.sh@93 -- # local resv
00:03:20.841 10:21:42 -- setup/hugepages.sh@94 -- # local anon
00:03:20.841 10:21:42 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:20.841 10:21:42 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:20.841 10:21:42 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:20.841 10:21:42 -- setup/common.sh@18 -- # local node=
00:03:20.841 10:21:42 -- setup/common.sh@19 -- # local var val
00:03:20.841 10:21:42 -- setup/common.sh@20 -- # local mem_f mem
00:03:20.841 10:21:42 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:20.841 10:21:42 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:20.841 10:21:42 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:20.841 10:21:42 -- setup/common.sh@28 -- # mapfile -t mem
00:03:20.841 10:21:42 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:20.841 10:21:42 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45337848 kB' 'MemAvailable: 48910776 kB' 'Buffers: 9788 kB' 'Cached: 10840960 kB' 'SwapCached: 0 kB' 'Active: 8252760 kB' 'Inactive: 3417780 kB' 'Active(anon): 7686840 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 822636 kB' 'Mapped: 144224 kB' 'Shmem: 6867048 kB' 'KReclaimable: 172748 kB' 'Slab: 411828 kB' 'SReclaimable: 172748 kB' 'SUnreclaim: 239080 kB' 'KernelStack: 16048 kB' 'PageTables: 8224 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9443428 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198752 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB'
[... setup/common.sh@31/@32 compare-and-continue scan over each snapshot key until AnonHugePages matches elided ...]
00:03:21.107 10:21:42 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:21.107 10:21:42 -- setup/common.sh@33 -- # echo 0
00:03:21.107 10:21:42 -- setup/common.sh@33 -- # return 0
00:03:21.107 10:21:42 -- setup/hugepages.sh@97 -- # anon=0
00:03:21.107 10:21:42 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:21.107 10:21:42 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:21.107 10:21:42 -- setup/common.sh@18 -- # local node=
00:03:21.107 10:21:42 -- setup/common.sh@19 -- # local var val
00:03:21.107 10:21:42 -- setup/common.sh@20 -- # local mem_f mem
00:03:21.107 10:21:42 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:21.107 10:21:42 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:21.107 10:21:42 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:21.107 10:21:42 -- setup/common.sh@28 -- # mapfile -t mem
00:03:21.107 10:21:42 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:21.107 10:21:42 -- setup/common.sh@31 -- # IFS=': '
00:03:21.107 10:21:42 -- setup/common.sh@31 -- # read -r var val _
00:03:21.107 10:21:42 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45338296 kB' 'MemAvailable: 48911208 kB' 'Buffers: 9788 kB' 'Cached: 10840964 kB' 'SwapCached: 0 kB' 'Active: 8252280 kB' 'Inactive: 3417780 kB' 'Active(anon): 7686360 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 822108 kB' 'Mapped: 144200 kB' 'Shmem: 6867052 kB' 'KReclaimable: 172716 kB' 'Slab: 411772 kB' 'SReclaimable: 172716 kB' 'SUnreclaim: 239056 kB' 'KernelStack: 16080 kB' 'PageTables: 8584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9443440 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198752 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB'
00:03:21.107 10:21:42 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:21.107 10:21:42 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:21.107 10:21:42 -- setup/common.sh@18 -- # local node=
00:03:21.107 10:21:42 -- setup/common.sh@19 -- # local var val
00:03:21.107 10:21:42 -- setup/common.sh@20 -- # local mem_f mem
00:03:21.107 10:21:42 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:21.107 10:21:42 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:21.107 10:21:42 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:21.107 10:21:42 -- setup/common.sh@28 -- # mapfile -t mem
00:03:21.107 10:21:42 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:21.107 10:21:42 -- setup/common.sh@31 -- # IFS=': '
00:03:21.107 10:21:42 -- setup/common.sh@31 -- # read -r var val _
00:03:21.107 10:21:42 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45338296 kB' 'MemAvailable: 48911208 kB' 'Buffers: 9788 kB' 'Cached: 10840964 kB' 'SwapCached: 0 kB' 'Active: 8252280 kB' 'Inactive: 3417780 kB' 'Active(anon): 7686360 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 822108 kB' 'Mapped: 144200 kB' 'Shmem: 6867052 kB' 'KReclaimable: 172716 kB' 'Slab: 411772 kB' 'SReclaimable: 172716 kB' 'SUnreclaim: 239056 kB' 'KernelStack: 16080 kB' 'PageTables: 8584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9443440 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198752 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB'
[trace condensed: setup/common.sh@32 tests every key of the snapshot above against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and continues on each non-match]
00:03:21.109 10:21:42 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:21.109 10:21:42 -- setup/common.sh@33 -- # echo 0
00:03:21.109 10:21:42 -- setup/common.sh@33 -- # return 0
00:03:21.109 10:21:42 -- setup/hugepages.sh@99 -- # surp=0
00:03:21.109 10:21:42 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:21.109 10:21:42 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:21.109 10:21:42 -- setup/common.sh@18 -- # local node=
00:03:21.109 10:21:42 -- setup/common.sh@19 -- # local var val
00:03:21.109 10:21:42 -- setup/common.sh@20 -- # local mem_f mem
00:03:21.109 10:21:42 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:21.109 10:21:42 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:21.109 10:21:42 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:21.109 10:21:43 -- setup/common.sh@28 -- # mapfile -t mem
00:03:21.109 10:21:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:21.109 10:21:43 -- setup/common.sh@31 -- # IFS=': '
00:03:21.109 10:21:43 -- setup/common.sh@31 -- # read -r var val _
00:03:21.109 10:21:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45338744 kB' 'MemAvailable: 48911656 kB' 'Buffers: 9788 kB' 'Cached: 10840964 kB' 'SwapCached: 0 kB' 'Active: 8251744 kB' 'Inactive: 3417780 kB' 'Active(anon): 7685824 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 822048 kB' 'Mapped: 144116 kB' 'Shmem: 6867052 kB' 'KReclaimable: 172716 kB' 'Slab: 411788 kB' 'SReclaimable: 172716 kB' 'SUnreclaim: 239072 kB' 'KernelStack: 15952 kB' 'PageTables: 8384 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9442060 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198768 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB'
[trace condensed: setup/common.sh@32 tests every key of the snapshot above against \H\u\g\e\P\a\g\e\s\_\R\s\v\d and continues on each non-match]
00:03:21.110 10:21:43 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:21.110 10:21:43 -- setup/common.sh@33 -- # echo 0
00:03:21.110 10:21:43 -- setup/common.sh@33 -- # return 0
00:03:21.110 10:21:43 -- setup/hugepages.sh@100 -- # resv=0
00:03:21.110 10:21:43 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
nr_hugepages=1024
00:03:21.110 10:21:43 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:03:21.110 10:21:43 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:03:21.110 10:21:43 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
anon_hugepages=0
00:03:21.110 10:21:43 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:21.110 10:21:43 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
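With anon=0, surp=0, and resv=0 read back and nr_hugepages=1024 requested, the arithmetic checks above assert that the configured pool is consistent before the per-node pass. A compact sketch of that accounting, reusing the made-up get_meminfo_sketch helper from the earlier example:

verify_hugepages_sketch() {
    # Mirrors the traced checks: HugePages_Total must cover the request
    # plus any surplus and reserved pages, and here must equal it exactly.
    local nr_hugepages=$1 surp resv total
    surp=$(get_meminfo_sketch HugePages_Surp)
    resv=$(get_meminfo_sketch HugePages_Rsvd)
    total=$(get_meminfo_sketch HugePages_Total)
    (( total == nr_hugepages + surp + resv )) && (( total == nr_hugepages ))
}
# e.g. verify_hugepages_sketch 1024 succeeds for the snapshots logged above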
00:03:21.110 10:21:43 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:21.110 10:21:43 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:21.110 10:21:43 -- setup/common.sh@18 -- # local node=
00:03:21.110 10:21:43 -- setup/common.sh@19 -- # local var val
00:03:21.110 10:21:43 -- setup/common.sh@20 -- # local mem_f mem
00:03:21.110 10:21:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:21.110 10:21:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:21.110 10:21:43 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:21.110 10:21:43 -- setup/common.sh@28 -- # mapfile -t mem
00:03:21.110 10:21:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:21.110 10:21:43 -- setup/common.sh@31 -- # IFS=': '
00:03:21.110 10:21:43 -- setup/common.sh@31 -- # read -r var val _
00:03:21.110 10:21:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45339780 kB' 'MemAvailable: 48912692 kB' 'Buffers: 9788 kB' 'Cached: 10840988 kB' 'SwapCached: 0 kB' 'Active: 8252148 kB' 'Inactive: 3417780 kB' 'Active(anon): 7686228 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 821992 kB' 'Mapped: 144124 kB' 'Shmem: 6867076 kB' 'KReclaimable: 172716 kB' 'Slab: 411788 kB' 'SReclaimable: 172716 kB' 'SUnreclaim: 239072 kB' 'KernelStack: 16032 kB' 'PageTables: 8064 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9441704 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198704 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB'
[trace condensed: setup/common.sh@32 tests every key of the snapshot above against \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l and continues on each non-match]
00:03:21.112 10:21:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:21.112 10:21:43 -- setup/common.sh@33 -- # echo 1024
00:03:21.112 10:21:43 -- setup/common.sh@33 -- # return 0
00:03:21.112 10:21:43 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:21.112 10:21:43 -- setup/hugepages.sh@112 -- # get_nodes
00:03:21.112 10:21:43 -- setup/hugepages.sh@27 -- # local node
00:03:21.112 10:21:43 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:21.112 10:21:43 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:21.112 10:21:43 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:21.112 10:21:43 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:21.112 10:21:43 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:21.112 10:21:43 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
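get_nodes found two NUMA nodes with 512 pages expected on each, so the calls that follow pass a node number to get_meminfo, which then reads /sys/devices/system/node/nodeN/meminfo instead of /proc/meminfo and strips the "Node N " prefix those sysfs lines carry (the traced mem=("${mem[@]#Node +([0-9]) }") expansion). A sketch of that per-node variant, with the same made-up naming as the earlier examples:

shopt -s extglob                       # the +([0-9]) pattern below needs extglob
get_node_meminfo_sketch() {
    local get=$1 node=$2 var val _ mem_f=/proc/meminfo
    local -a mem
    # With a node argument, prefer that node's sysfs meminfo when present.
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # drop the "Node N " prefix sysfs adds
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}
# e.g. get_node_meminfo_sketch HugePages_Surp 0 -> prints 0 for node0 below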
00:03:21.112 10:21:43 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:21.112 10:21:43 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:21.112 10:21:43 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:21.112 10:21:43 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:21.112 10:21:43 -- setup/common.sh@18 -- # local node=0
00:03:21.112 10:21:43 -- setup/common.sh@19 -- # local var val
00:03:21.112 10:21:43 -- setup/common.sh@20 -- # local mem_f mem
00:03:21.112 10:21:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:21.112 10:21:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:21.112 10:21:43 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:21.112 10:21:43 -- setup/common.sh@28 -- # mapfile -t mem
00:03:21.112 10:21:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:21.112 10:21:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634128 kB' 'MemFree: 21918380 kB' 'MemUsed: 10715748 kB' 'SwapCached: 0 kB' 'Active: 5710680 kB' 'Inactive: 3357500 kB' 'Active(anon): 5306676 kB' 'Inactive(anon): 0 kB' 'Active(file): 404004 kB' 'Inactive(file): 3357500 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8598152 kB' 'Mapped: 48440 kB' 'AnonPages: 473248 kB' 'Shmem: 4836648 kB' 'KernelStack: 9288 kB' 'PageTables: 4736 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 100284 kB' 'Slab: 230864 kB' 'SReclaimable: 100284 kB' 'SUnreclaim: 130580 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:21.112 10:21:43 -- setup/common.sh@31 -- # IFS=': '
00:03:21.112 10:21:43 -- setup/common.sh@31 -- # read -r var val _
[trace condensed: setup/common.sh@32 tests every node0 key against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and continues on each non-match]
00:03:21.113 10:21:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:21.113 10:21:43 -- setup/common.sh@33 -- # echo 0
00:03:21.113 10:21:43 -- setup/common.sh@33 -- # return 0
00:03:21.113 10:21:43 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:21.113 10:21:43 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:21.113 10:21:43 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:21.113 10:21:43 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:21.113 10:21:43 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:21.113 10:21:43 -- setup/common.sh@18 -- # local node=1
00:03:21.113 10:21:43 -- setup/common.sh@19 -- # local var val
00:03:21.113 10:21:43 -- setup/common.sh@20 -- # local mem_f mem
00:03:21.113 10:21:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:21.113 10:21:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:21.113 10:21:43 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:21.113 10:21:43 -- setup/common.sh@28 -- # mapfile -t mem
00:03:21.113 10:21:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:21.113 10:21:43 -- setup/common.sh@31 -- # IFS=': '
00:03:21.113 10:21:43 -- setup/common.sh@31 -- # read -r var val _
00:03:21.113 10:21:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27661496 kB' 'MemFree: 23420168 kB' 'MemUsed: 4241328 kB' 'SwapCached: 0 kB' 'Active: 2540820 kB' 'Inactive: 60280 kB' 'Active(anon): 2378904 kB' 'Inactive(anon): 0 kB' 'Active(file): 161916 kB' 'Inactive(file): 60280 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2252648 kB' 'Mapped: 95512 kB' 'AnonPages: 348576 kB' 'Shmem: 2030452 kB' 'KernelStack: 6728 kB' 'PageTables: 3484 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 72432 kB' 'Slab: 180792 kB' 'SReclaimable: 72432 kB' 'SUnreclaim: 108360 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:21.114 10:21:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:21.114 10:21:43 -- setup/common.sh@33 -- # echo 0
00:03:21.114 10:21:43 -- setup/common.sh@33 -- # return 0
00:03:21.114 10:21:43 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:21.114 10:21:43 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:21.114 10:21:43 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:21.114 10:21:43 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:21.114 10:21:43 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:03:21.114 node0=512 expecting 512
00:03:21.114 10:21:43 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:21.114 10:21:43 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:21.114 10:21:43 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:21.114 10:21:43 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:03:21.114 node1=512 expecting 512
00:03:21.114 10:21:43 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:03:21.114 
00:03:21.114 real 0m3.805s
00:03:21.114 user 0m1.428s
00:03:21.114 sys 0m2.407s
00:03:21.114 10:21:43 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:03:21.114 10:21:43 -- common/autotest_common.sh@10 -- # set +x
00:03:21.114 ************************************
00:03:21.114 END TEST even_2G_alloc
00:03:21.114 ************************************
00:03:21.114 10:21:43 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:03:21.114 10:21:43 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:21.114 10:21:43 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:21.114 10:21:43 -- common/autotest_common.sh@10 -- # set +x
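The even_2G_alloc pass above leans entirely on setup/common.sh's get_meminfo helper: pick /proc/meminfo or the per-NUMA-node copy in sysfs, strip the "Node N " prefix, then read key by key until the requested field turns up. A minimal sketch of that technique, reconstructed from the xtrace rather than copied from SPDK (names follow the trace; the body is an assumption):

# extglob is needed for the +([0-9]) pattern that strips "Node N " prefixes.
shopt -s extglob

get_meminfo() {
    local get=$1 node=${2:-}      # field name, optional NUMA node
    local var val _ line
    local mem_f=/proc/meminfo mem
    # Per-node dumps live in sysfs and prefix every line with "Node N ".
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem <"$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")
    # Walk the dump key by key; each "continue" entry in the trace above
    # is this loop skipping a field that is not the one requested.
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<<"$line"
        if [[ $var == "$get" ]]; then
            echo "${val:-0}"
            return 0
        fi
    done
    return 1
}

# Example, matching the trace: get_meminfo HugePages_Surp 1  ->  0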
00:03:21.374 ************************************
00:03:21.374 START TEST odd_alloc
00:03:21.374 ************************************
00:03:21.374 10:21:43 -- common/autotest_common.sh@1111 -- # odd_alloc
00:03:21.374 10:21:43 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:03:21.374 10:21:43 -- setup/hugepages.sh@49 -- # local size=2098176
00:03:21.374 10:21:43 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:21.374 10:21:43 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:21.374 10:21:43 -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:03:21.374 10:21:43 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:21.374 10:21:43 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:21.374 10:21:43 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:21.374 10:21:43 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:03:21.374 10:21:43 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:21.374 10:21:43 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:21.374 10:21:43 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:21.374 10:21:43 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:21.374 10:21:43 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:21.375 10:21:43 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:21.375 10:21:43 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:21.375 10:21:43 -- setup/hugepages.sh@83 -- # : 513
00:03:21.375 10:21:43 -- setup/hugepages.sh@84 -- # : 1
00:03:21.375 10:21:43 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:21.375 10:21:43 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:03:21.375 10:21:43 -- setup/hugepages.sh@83 -- # : 0
00:03:21.375 10:21:43 -- setup/hugepages.sh@84 -- # : 0
00:03:21.375 10:21:43 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:21.375 10:21:43 -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:03:21.375 10:21:43 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:03:21.375 10:21:43 -- setup/hugepages.sh@160 -- # setup output
00:03:21.375 10:21:43 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:21.375 10:21:43 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:24.678 0000:5e:00.0 (144d a80a): Already using the vfio-pci driver
00:03:24.678 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:24.678 0000:af:00.0 (8086 2701): Already using the vfio-pci driver
00:03:24.678 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:24.678 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:24.678 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:24.678 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:24.678 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:24.678 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:24.678 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:24.678 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:24.678 0000:b0:00.0 (8086 4140): Already using the vfio-pci driver
00:03:24.678 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:24.678 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:24.678 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:24.678 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:24.678 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:24.678 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:24.678 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
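The numbers in the odd_alloc setup above are worth a quick sanity check. HUGEMEM=2049 MB of 2048 kB hugepages is 2098176 kB, i.e. 1024.5 pages rounded up to the deliberately odd 1025, and get_test_nr_hugepages_per_node then spreads that count across the two NUMA nodes, which is why the trace assigns nodes_test[1]=512 and then nodes_test[0]=513. A hypothetical standalone re-derivation that mirrors the trace (it is not the setup/hugepages.sh code itself):

# Re-derive nr_hugepages=1025 and the 513/512 node split (hypothetical).
size_kb=2098176                   # HUGEMEM=2049 (MB) * 1024
hugepagesize_kb=2048
nr_hugepages=$(( (size_kb + hugepagesize_kb - 1) / hugepagesize_kb ))
echo "nr_hugepages=$nr_hugepages"              # 1025

_no_nodes=2
declare -a nodes_test
base=$(( nr_hugepages / _no_nodes ))           # 512 pages per node
rem=$(( nr_hugepages % _no_nodes ))            # 1 page left over
for (( node = 0; node < _no_nodes; node++ )); do
    # The first $rem nodes absorb the remainder, one extra page each.
    nodes_test[node]=$(( base + (node < rem ? 1 : 0) ))
done
echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"   # node0=513 node1=512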
00:03:24.944 10:21:46 -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:03:24.944 10:21:46 -- setup/hugepages.sh@89 -- # local node
00:03:24.944 10:21:46 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:24.944 10:21:46 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:24.944 10:21:46 -- setup/hugepages.sh@92 -- # local surp
00:03:24.944 10:21:46 -- setup/hugepages.sh@93 -- # local resv
00:03:24.944 10:21:46 -- setup/hugepages.sh@94 -- # local anon
00:03:24.944 10:21:46 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:24.944 10:21:46 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:24.944 10:21:46 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:24.944 10:21:46 -- setup/common.sh@18 -- # local node=
00:03:24.944 10:21:46 -- setup/common.sh@19 -- # local var val
00:03:24.944 10:21:46 -- setup/common.sh@20 -- # local mem_f mem
00:03:24.944 10:21:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:24.944 10:21:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:24.944 10:21:46 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:24.944 10:21:46 -- setup/common.sh@28 -- # mapfile -t mem
00:03:24.944 10:21:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:24.944 10:21:46 -- setup/common.sh@31 -- # IFS=': '
00:03:24.944 10:21:46 -- setup/common.sh@31 -- # read -r var val _
00:03:24.944 10:21:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45344080 kB' 'MemAvailable: 48916992 kB' 'Buffers: 9788 kB' 'Cached: 10841080 kB' 'SwapCached: 0 kB' 'Active: 8253636 kB' 'Inactive: 3417780 kB' 'Active(anon): 7687716 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 823800 kB' 'Mapped: 144160 kB' 'Shmem: 6867168 kB' 'KReclaimable: 172716 kB' 'Slab: 411704 kB' 'SReclaimable: 172716 kB' 'SUnreclaim: 238988 kB' 'KernelStack: 15952 kB' 'PageTables: 8224 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486816 kB' 'Committed_AS: 9441876 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198704 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB'
[xtrace condensed: the per-field scan steps through the dump, MemTotal through HardwareCorrupted, until AnonHugePages matches]
00:03:24.945 10:21:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:24.945 10:21:46 -- setup/common.sh@33 -- # echo 0
00:03:24.945 10:21:46 -- setup/common.sh@33 -- # return 0
00:03:24.945 10:21:46 -- setup/hugepages.sh@97 -- # anon=0
00:03:24.945 10:21:46 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:24.945 10:21:46 -- setup/common.sh@17 -- # local get=HugePages_Surp
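A note on the heavily escaped patterns that dominate this trace, such as \A\n\o\n\H\u\g\e\P\a\g\e\s and \H\u\g\e\P\a\g\e\s\_\S\u\r\p: they are an xtrace artifact, not something the script spells out. Inside [[ ]] the right-hand side of == is a glob pattern, so the comparison quotes it to force a literal match, and bash -x re-prints the quoted word with every character backslash-escaped. A two-line demo, hypothetical and only meant to reproduce the effect:

# Run with: bash -x demo.sh
# The trace prints: [[ MemTotal == \M\e\m\T\o\t\a\l ]]
get=MemTotal
[[ MemTotal == "$get" ]] && echo matched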
00:03:24.945 10:21:46 -- setup/common.sh@18 -- # local node=
00:03:24.945 10:21:46 -- setup/common.sh@19 -- # local var val
00:03:24.945 10:21:46 -- setup/common.sh@20 -- # local mem_f mem
00:03:24.945 10:21:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:24.945 10:21:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:24.945 10:21:46 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:24.945 10:21:46 -- setup/common.sh@28 -- # mapfile -t mem
00:03:24.945 10:21:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:24.945 10:21:46 -- setup/common.sh@31 -- # IFS=': '
00:03:24.945 10:21:46 -- setup/common.sh@31 -- # read -r var val _
00:03:24.945 10:21:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45343276 kB' 'MemAvailable: 48916188 kB' 'Buffers: 9788 kB' 'Cached: 10841084 kB' 'SwapCached: 0 kB' 'Active: 8253360 kB' 'Inactive: 3417780 kB' 'Active(anon): 7687440 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 823556 kB' 'Mapped: 144148 kB' 'Shmem: 6867172 kB' 'KReclaimable: 172716 kB' 'Slab: 411744 kB' 'SReclaimable: 172716 kB' 'SUnreclaim: 239028 kB' 'KernelStack: 15952 kB' 'PageTables: 8240 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486816 kB' 'Committed_AS: 9441888 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198672 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB'
[xtrace condensed: per-field scan over the dump until HugePages_Surp matches]
00:03:24.947 10:21:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:24.947 10:21:46 -- setup/common.sh@33 -- # echo 0
00:03:24.947 10:21:46 -- setup/common.sh@33 -- # return 0
00:03:24.947 10:21:46 -- setup/hugepages.sh@99 -- # surp=0
00:03:24.947 10:21:46 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:24.947 10:21:46 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:24.947 10:21:46 -- setup/common.sh@18 -- # local node=
00:03:24.947 10:21:46 -- setup/common.sh@19 -- # local var val
00:03:24.947 10:21:46 -- setup/common.sh@20 -- # local mem_f mem
00:03:24.947 10:21:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:24.947 10:21:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:24.947 10:21:46 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:24.947 10:21:46 -- setup/common.sh@28 -- # mapfile -t mem
00:03:24.947 10:21:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:24.947 10:21:46 -- setup/common.sh@31 -- # IFS=': '
00:03:24.947 10:21:46 -- setup/common.sh@31 -- # read -r var val _
00:03:24.947 10:21:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45343276 kB' 'MemAvailable: 48916188 kB' 'Buffers: 9788 kB' 'Cached: 10841096 kB' 'SwapCached: 0 kB' 'Active: 8253372 kB' 'Inactive: 3417780 kB' 'Active(anon): 7687452 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 823548 kB' 'Mapped: 144148 kB' 'Shmem: 6867184 kB' 'KReclaimable: 172716 kB' 'Slab: 411744 kB' 'SReclaimable: 172716 kB' 'SUnreclaim: 239028 kB' 'KernelStack: 15952 kB' 'PageTables: 8240 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486816 kB' 'Committed_AS: 9441904 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198672 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB'
[xtrace condensed: per-field scan over the dump until HugePages_Rsvd matches]
10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.948 10:21:46 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:24.948 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.948 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.948 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.948 10:21:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:24.948 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.948 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.948 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.948 10:21:46 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:24.948 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.948 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.948 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.948 10:21:46 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:24.948 10:21:46 -- setup/common.sh@33 -- # echo 0 00:03:24.948 10:21:46 -- setup/common.sh@33 -- # return 0 00:03:24.948 10:21:46 -- setup/hugepages.sh@100 -- # resv=0 00:03:24.948 10:21:46 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:03:24.948 nr_hugepages=1025 00:03:24.948 10:21:46 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:24.948 resv_hugepages=0 00:03:24.948 10:21:46 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:24.948 surplus_hugepages=0 00:03:24.949 10:21:46 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:24.949 anon_hugepages=0 00:03:24.949 10:21:46 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:24.949 10:21:46 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:03:24.949 10:21:46 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:24.949 10:21:46 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:24.949 10:21:46 -- setup/common.sh@18 -- # local node= 00:03:24.949 10:21:46 -- setup/common.sh@19 -- # local var val 00:03:24.949 10:21:46 -- setup/common.sh@20 -- # local mem_f mem 00:03:24.949 10:21:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:24.949 10:21:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:24.949 10:21:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:24.949 10:21:46 -- setup/common.sh@28 -- # mapfile -t mem 00:03:24.949 10:21:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.949 10:21:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45343876 kB' 'MemAvailable: 48916788 kB' 'Buffers: 9788 kB' 'Cached: 10841108 kB' 'SwapCached: 0 kB' 'Active: 8253608 kB' 'Inactive: 3417780 kB' 'Active(anon): 7687688 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 823748 kB' 'Mapped: 144148 kB' 'Shmem: 6867196 kB' 'KReclaimable: 172716 kB' 'Slab: 411736 kB' 'SReclaimable: 172716 kB' 'SUnreclaim: 239020 kB' 'KernelStack: 15952 kB' 'PageTables: 8236 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486816 kB' 'Committed_AS: 9444344 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198672 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 
'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB' 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # 
[[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.949 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.949 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 
10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l 
]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # continue 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.950 10:21:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.950 10:21:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.950 10:21:46 -- setup/common.sh@33 -- # echo 1025 00:03:24.950 10:21:46 -- setup/common.sh@33 -- # return 0 00:03:24.950 
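The get_meminfo helper whose trace just completed reads /proc/meminfo (or, given a node number, the per-node meminfo under sysfs) and prints the value of one key. A minimal sketch of the pattern visible in the xtrace above; it is reconstructed from the trace rather than copied from test/setup/common.sh, so treat the details as approximate:

  shopt -s extglob   # needed for the +([0-9]) pattern below

  get_meminfo() {
      local get=$1 node=$2
      local var val _
      local mem_f mem line
      mem_f=/proc/meminfo
      # Per-node counters live in sysfs; fall back to the global file otherwise.
      if [[ -e /sys/devices/system/node/node$node/meminfo && -n $node ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")   # strip the "Node <N> " prefix sysfs adds
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          # Print the value for the requested key and stop; otherwise keep scanning.
          if [[ $var == "$get" ]]; then
              echo "$val"
              return 0
          fi
      done
  }

Against the dump above, get_meminfo HugePages_Total would print 1025, while get_meminfo HugePages_Surp 0 would read node0's sysfs meminfo and print 0, which is exactly what the following trace does.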
00:03:24.950 10:21:46 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
00:03:24.950 10:21:46 -- setup/hugepages.sh@112 -- # get_nodes
00:03:24.950 10:21:46 -- setup/hugepages.sh@27 -- # local node
00:03:24.950 10:21:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:24.950 10:21:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:24.950 10:21:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:24.950 10:21:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:03:24.950 10:21:46 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:24.950 10:21:46 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:24.950 10:21:46 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:24.950 10:21:46 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:24.951 10:21:46 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:24.951 10:21:46 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:24.951 10:21:46 -- setup/common.sh@18 -- # local node=0
00:03:24.951 10:21:46 -- setup/common.sh@19 -- # local var val
00:03:24.951 10:21:46 -- setup/common.sh@20 -- # local mem_f mem
00:03:24.951 10:21:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:24.951 10:21:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:24.951 10:21:46 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:24.951 10:21:46 -- setup/common.sh@28 -- # mapfile -t mem
00:03:24.951 10:21:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:24.951 10:21:46 -- setup/common.sh@31 -- # IFS=': '
00:03:24.951 10:21:46 -- setup/common.sh@31 -- # read -r var val _
00:03:24.951 10:21:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634128 kB' 'MemFree: 21921924 kB' 'MemUsed: 10712204 kB' 'SwapCached: 0 kB' 'Active: 5713968 kB' 'Inactive: 3357500 kB' 'Active(anon): 5309964 kB' 'Inactive(anon): 0 kB' 'Active(file): 404004 kB' 'Inactive(file): 3357500 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8598232 kB' 'Mapped: 48636 kB' 'AnonPages: 476440 kB' 'Shmem: 4836728 kB' 'KernelStack: 9304 kB' 'PageTables: 4744 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 100284 kB' 'Slab: 231024 kB' 'SReclaimable: 100284 kB' 'SUnreclaim: 130740 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:24.951 10:21:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:24.951 10:21:46 -- setup/common.sh@32 -- # continue
[xtrace elided: the same per-key cycle repeats for every node0 meminfo key from MemFree through HugePages_Free]
00:03:24.952 10:21:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:24.952 10:21:46 -- setup/common.sh@33 -- # echo 0
00:03:24.952 10:21:46 -- setup/common.sh@33 -- # return 0
00:03:24.952 10:21:46 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:24.952 10:21:47 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:24.952 10:21:47 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:24.952 10:21:47 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:24.952 10:21:47 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:24.952 10:21:47 -- setup/common.sh@18 -- # local node=1
00:03:24.952 10:21:47 -- setup/common.sh@19 -- # local var val
00:03:24.952 10:21:47 -- setup/common.sh@20 -- # local mem_f mem
00:03:24.952 10:21:47 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:24.952 10:21:47 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:24.952 10:21:47 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:24.952 10:21:47 -- setup/common.sh@28 -- # mapfile -t mem
00:03:24.952 10:21:47 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:24.952 10:21:47 -- setup/common.sh@31 -- # IFS=': '
00:03:24.952 10:21:47 -- setup/common.sh@31 -- # read -r var val _
00:03:24.952 10:21:47 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27661496 kB' 'MemFree: 23422080 kB' 'MemUsed: 4239416 kB' 'SwapCached: 0 kB' 'Active: 2539572 kB' 'Inactive: 60280 kB' 'Active(anon): 2377656 kB' 'Inactive(anon): 0 kB' 'Active(file): 161916 kB' 'Inactive(file): 60280 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2252680 kB' 'Mapped: 95512 kB' 'AnonPages: 347224 kB' 'Shmem: 2030484 kB' 'KernelStack: 6616 kB' 'PageTables: 3360 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 72432 kB' 'Slab: 180712 kB' 'SReclaimable: 72432 kB' 'SUnreclaim: 108280 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
00:03:24.952 10:21:47 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:24.952 10:21:47 -- setup/common.sh@32 -- # continue
[xtrace elided: the same per-key cycle repeats for every node1 meminfo key from MemFree through HugePages_Free]
00:03:24.953 10:21:47 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:24.953 10:21:47 -- setup/common.sh@33 -- # echo 0
00:03:24.953 10:21:47 -- setup/common.sh@33 -- # return 0
00:03:24.953 10:21:47 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:24.953 10:21:47 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:24.953 10:21:47 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:24.953 10:21:47 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:24.953 10:21:47 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513'
00:03:24.953 node0=512 expecting 513
00:03:24.953 10:21:47 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:24.953 10:21:47 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:24.953 10:21:47 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:24.953 10:21:47 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512'
00:03:24.953 node1=513 expecting 512
00:03:24.953 10:21:47 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:03:24.953
00:03:24.953 real 0m3.721s
00:03:24.953 user 0m1.428s
00:03:24.953 sys 0m2.356s
00:03:24.953 10:21:47 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:03:24.953 10:21:47 -- common/autotest_common.sh@10 -- # set +x
00:03:25.215 ************************************
00:03:25.215 END TEST odd_alloc
00:03:25.215 ************************************
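The odd_alloc test that just finished splits its 1025 pages as 512/513 across the two NUMA nodes and then accepts the kernel placing the two counts on either node, which is why the output reads node0=512 expecting 513 alongside node1=513 expecting 512. A sketch of the comparison the hugepages.sh@126-@130 trace performs, with the values from this run filled in as illustration; indexing a plain array by the count itself is what makes the key lists come out sorted:

  #!/usr/bin/env bash
  nodes_test=(512 513)   # per-node counts the test configured
  nodes_sys=(513 512)    # per-node counts the kernel actually reports
  sorted_t=() sorted_s=()
  for node in "${!nodes_test[@]}"; do
      # Index = page count, so ${!sorted_*[*]} lists the counts in ascending order.
      sorted_t[nodes_test[node]]=1
      sorted_s[nodes_sys[node]]=1
      echo "node$node=${nodes_test[node]} expecting ${nodes_sys[node]}"
  done
  # Equal key sets mean the same counts exist somewhere, node placement aside:
  # here both expand to "512 513", matching the [[ 512 513 == \5\1\2\ \5\1\3 ]] trace.
  [[ ${!sorted_t[*]} == "${!sorted_s[*]}" ]]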
00:03:25.215 10:21:47 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc
00:03:25.215 10:21:47 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:25.215 10:21:47 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:25.215 10:21:47 -- common/autotest_common.sh@10 -- # set +x
00:03:25.215 ************************************
00:03:25.215 START TEST custom_alloc
00:03:25.215 ************************************
00:03:25.215 10:21:47 -- common/autotest_common.sh@1111 -- # custom_alloc
00:03:25.215 10:21:47 -- setup/hugepages.sh@167 -- # local IFS=,
00:03:25.215 10:21:47 -- setup/hugepages.sh@169 -- # local node
00:03:25.215 10:21:47 -- setup/hugepages.sh@170 -- # nodes_hp=()
00:03:25.215 10:21:47 -- setup/hugepages.sh@170 -- # local nodes_hp
00:03:25.215 10:21:47 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0
00:03:25.215 10:21:47 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576
00:03:25.215 10:21:47 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:25.215 10:21:47 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:25.215 10:21:47 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:25.215 10:21:47 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:25.215 10:21:47 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:25.215 10:21:47 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:25.215 10:21:47 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:25.215 10:21:47 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:25.215 10:21:47 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:25.215 10:21:47 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:25.215 10:21:47 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:25.215 10:21:47 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:25.215 10:21:47 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:25.215 10:21:47 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:25.215 10:21:47 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:03:25.215 10:21:47 -- setup/hugepages.sh@83 -- # : 256
00:03:25.215 10:21:47 -- setup/hugepages.sh@84 -- # : 1
00:03:25.215 10:21:47 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:25.215 10:21:47 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:03:25.215 10:21:47 -- setup/hugepages.sh@83 -- # : 0
00:03:25.215 10:21:47 -- setup/hugepages.sh@84 -- # : 0
00:03:25.215 10:21:47 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:25.215 10:21:47 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512
00:03:25.215 10:21:47 -- setup/hugepages.sh@176 -- # (( 2 > 1 ))
00:03:25.215 10:21:47 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152
00:03:25.215 10:21:47 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:25.215 10:21:47 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:25.215 10:21:47 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:25.215 10:21:47 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:25.215 10:21:47 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:25.215 10:21:47 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:25.215 10:21:47 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:25.215 10:21:47 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:25.215 10:21:47 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:25.215 10:21:47 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:25.215 10:21:47 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:25.215 10:21:47 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:25.215 10:21:47 -- setup/hugepages.sh@74 -- # (( 1 > 0 ))
00:03:25.215 10:21:47 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:25.215 10:21:47 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:03:25.215 10:21:47 -- setup/hugepages.sh@78 -- # return 0
00:03:25.215 10:21:47 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024
00:03:25.215 10:21:47 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:03:25.215 10:21:47 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:03:25.215 10:21:47 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:03:25.215 10:21:47 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:03:25.215 10:21:47 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:03:25.215 10:21:47 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:03:25.215 10:21:47 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node
00:03:25.215 10:21:47 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:25.215 10:21:47 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:25.215 10:21:47 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:25.215 10:21:47 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:25.215 10:21:47 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:25.215 10:21:47 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:25.215 10:21:47 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:25.215 10:21:47 -- setup/hugepages.sh@74 -- # (( 2 > 0 ))
00:03:25.215 10:21:47 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:25.215 10:21:47 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:03:25.215 10:21:47 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:25.215 10:21:47 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024
00:03:25.215 10:21:47 -- setup/hugepages.sh@78 -- # return 0
00:03:25.215 10:21:47 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:03:25.215 10:21:47 -- setup/hugepages.sh@187 -- # setup output
00:03:25.215 10:21:47 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:25.216 10:21:47 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:28.520 0000:5e:00.0 (144d a80a): Already using the vfio-pci driver
00:03:28.521 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:28.521 0000:af:00.0 (8086 2701): Already using the vfio-pci driver
00:03:28.521 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:28.521 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:28.521 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:28.521 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:28.521 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:28.521 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:28.521 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:28.521 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:28.521 0000:b0:00.0 (8086 4140): Already using the vfio-pci driver
00:03:28.521 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:28.521 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:28.521 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:28.521 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:28.521 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:28.521 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:28.521 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
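setup.sh has just been driven by the HUGENODE string built in the trace above. The arithmetic behind that string, reconstructed from the trace as a sketch (sizes passed to get_test_nr_hugepages are in kB, and Hugepagesize on this box is 2048 kB, so 1048576 kB yields 512 pages for node 0 and 2097152 kB yields 1024 pages for node 1):

  #!/usr/bin/env bash
  default_hugepages=2048   # kB, matching 'Hugepagesize: 2048 kB' in the dumps above
  nodes_hp=()
  nodes_hp[0]=$(( 1048576 / default_hugepages ))   # 512 pages (~1 GiB)
  nodes_hp[1]=$(( 2097152 / default_hugepages ))   # 1024 pages (~2 GiB)
  # Join one "nodes_hp[<node>]=<pages>" entry per node with commas, as the
  # local IFS=, / HUGENODE+=(...) trace lines do.
  parts=()
  for node in "${!nodes_hp[@]}"; do
      parts+=("nodes_hp[$node]=${nodes_hp[node]}")
  done
  HUGENODE=$(IFS=,; printf '%s' "${parts[*]}")
  echo "$HUGENODE"   # nodes_hp[0]=512,nodes_hp[1]=1024

The total, 512 + 1024 = 1536 pages, is what verify_nr_hugepages checks next.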
'Inactive: 3417780 kB' 'Active(anon): 7689544 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 825584 kB' 'Mapped: 144272 kB' 'Shmem: 6867288 kB' 'KReclaimable: 172844 kB' 'Slab: 411616 kB' 'SReclaimable: 172844 kB' 'SUnreclaim: 238772 kB' 'KernelStack: 15952 kB' 'PageTables: 8132 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963552 kB' 'Committed_AS: 9442020 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198736 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB' 00:03:28.786 10:21:50 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.786 10:21:50 -- setup/common.sh@32 -- # continue 00:03:28.786 10:21:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.786 10:21:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.786 10:21:50 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.786 10:21:50 -- setup/common.sh@32 -- # continue 00:03:28.786 10:21:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.786 10:21:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.786 10:21:50 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.786 10:21:50 -- setup/common.sh@32 -- # continue 00:03:28.786 10:21:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.786 10:21:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.786 10:21:50 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.786 10:21:50 -- setup/common.sh@32 -- # continue 00:03:28.786 10:21:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.786 10:21:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.786 10:21:50 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.786 10:21:50 -- setup/common.sh@32 -- # continue 00:03:28.786 10:21:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.786 10:21:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.786 10:21:50 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.786 10:21:50 -- setup/common.sh@32 -- # continue 00:03:28.786 10:21:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.786 10:21:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.786 10:21:50 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.786 10:21:50 -- setup/common.sh@32 -- # continue 00:03:28.786 10:21:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.786 10:21:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.786 10:21:50 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.786 10:21:50 -- setup/common.sh@32 -- # continue 00:03:28.786 10:21:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.786 10:21:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.786 10:21:50 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.786 10:21:50 -- setup/common.sh@32 -- # continue 00:03:28.786 10:21:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.786 
00:03:28.787 10:21:50 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:28.787 10:21:50 -- setup/common.sh@33 -- # echo 0
00:03:28.787 10:21:50 -- setup/common.sh@33 -- # return 0
00:03:28.787 10:21:50 -- setup/hugepages.sh@97 -- # anon=0
00:03:28.787 10:21:50 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:28.787 10:21:50 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:28.787 10:21:50 -- setup/common.sh@18 -- # local node=
00:03:28.787 10:21:50 -- setup/common.sh@19 -- # local var val
00:03:28.787 10:21:50 -- setup/common.sh@20 -- # local mem_f mem
00:03:28.787 10:21:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:28.787 10:21:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:28.787 10:21:50 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:28.787 10:21:50 -- setup/common.sh@28 -- # mapfile -t mem
00:03:28.787 10:21:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:28.787 10:21:50 -- setup/common.sh@31 -- # IFS=': '
00:03:28.787 10:21:50 -- setup/common.sh@31 -- # read -r var val _
00:03:28.787 10:21:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 44283480 kB' 'MemAvailable: 47856424 kB' 'Buffers: 9788 kB' 'Cached: 10841212 kB' 'SwapCached: 0 kB' 'Active: 8255440 kB' 'Inactive: 3417780 kB' 'Active(anon): 7689520 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 825568 kB' 'Mapped: 144176 kB' 'Shmem: 6867300 kB' 'KReclaimable: 172780 kB' 'Slab: 411648 kB' 'SReclaimable: 172780 kB' 'SUnreclaim: 238868 kB' 'KernelStack: 15968 kB' 'PageTables: 8256 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963552 kB' 'Committed_AS: 9442404 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198752 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB'
[xtrace elided: setup/common.sh@31-32 reads each /proc/meminfo key/value pair in turn and continues until it reaches HugePages_Surp]
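One detail worth calling out in the get_meminfo preamble above: mem=("${mem[@]#Node +([0-9]) }") strips a leading "Node N " from every array element. /proc/meminfo carries no such prefix, but the per-node files under /sys/devices/system/node/ do, so the strip lets one parser serve both sources. The +([0-9]) pattern requires extglob; a short illustration with a hypothetical sample line:

    shopt -s extglob                         # needed for the +([0-9]) extended glob
    line='Node 0 HugePages_Total:   512'     # hypothetical node-meminfo style line
    echo "${line#Node +([0-9]) }"            # prints: HugePages_Total:   512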
00:03:28.789 10:21:50 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:28.789 10:21:50 -- setup/common.sh@33 -- # echo 0
00:03:28.789 10:21:50 -- setup/common.sh@33 -- # return 0
00:03:28.789 10:21:50 -- setup/hugepages.sh@99 -- # surp=0
00:03:28.789 10:21:50 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:28.789 10:21:50 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:28.789 10:21:50 -- setup/common.sh@18 -- # local node=
00:03:28.789 10:21:50 -- setup/common.sh@19 -- # local var val
00:03:28.789 10:21:50 -- setup/common.sh@20 -- # local mem_f mem
00:03:28.789 10:21:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:28.789 10:21:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:28.789 10:21:50 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:28.789 10:21:50 -- setup/common.sh@28 -- # mapfile -t mem
00:03:28.789 10:21:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:28.789 10:21:50 -- setup/common.sh@31 -- # IFS=': '
00:03:28.789 10:21:50 -- setup/common.sh@31 -- # read -r var val _
00:03:28.789 10:21:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 44283892 kB' 'MemAvailable: 47856836 kB' 'Buffers: 9788 kB' 'Cached: 10841224 kB' 'SwapCached: 0 kB' 'Active: 8255304 kB' 'Inactive: 3417780 kB' 'Active(anon): 7689384 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 825448 kB' 'Mapped: 144176 kB' 'Shmem: 6867312 kB' 'KReclaimable: 172780 kB' 'Slab: 411648 kB' 'SReclaimable: 172780 kB' 'SUnreclaim: 238868 kB' 'KernelStack: 15968 kB' 'PageTables: 8236 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963552 kB' 'Committed_AS: 9442420 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198752 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB'
[xtrace elided: setup/common.sh@31-32 reads each /proc/meminfo key/value pair in turn and continues until it reaches HugePages_Rsvd]
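For comparison, the surplus and reserved counters the trace is fetching here can be read outside the harness with a one-liner each (equivalent in effect, not the script's own code):

    awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo   # 0 in this run
    awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo   # 0 in this run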
00:03:28.791 10:21:50 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:28.791 10:21:50 -- setup/common.sh@33 -- # echo 0
00:03:28.791 10:21:50 -- setup/common.sh@33 -- # return 0
00:03:28.791 10:21:50 -- setup/hugepages.sh@100 -- # resv=0
00:03:28.791 10:21:50 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536
nr_hugepages=1536
00:03:28.791 10:21:50 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:03:28.791 10:21:50 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
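The three echoes above feed the consistency check that follows: the pool passes only if HugePages_Total equals the requested count once surplus and reserved pages are folded in. A sketch of that bookkeeping (variable names illustrative, not the harness code):

    #!/usr/bin/env bash
    # Sketch of the verification arithmetic (illustrative names).
    nr_hugepages=1536
    surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
    resv=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    # Here: 1536 == 1536 + 0 + 0, so the check passes.
    (( total == nr_hugepages + surp + resv )) || echo 'hugepage pool mismatch' >&2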
00:03:28.791 10:21:50 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
anon_hugepages=0
00:03:28.791 10:21:50 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv ))
00:03:28.791 10:21:50 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages ))
00:03:28.791 10:21:50 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:28.791 10:21:50 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:28.791 10:21:50 -- setup/common.sh@18 -- # local node=
00:03:28.791 10:21:50 -- setup/common.sh@19 -- # local var val
00:03:28.791 10:21:50 -- setup/common.sh@20 -- # local mem_f mem
00:03:28.791 10:21:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:28.791 10:21:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:28.791 10:21:50 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:28.791 10:21:50 -- setup/common.sh@28 -- # mapfile -t mem
00:03:28.791 10:21:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:28.791 10:21:50 -- setup/common.sh@31 -- # IFS=': '
00:03:28.791 10:21:50 -- setup/common.sh@31 -- # read -r var val _
00:03:28.791 10:21:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 44283892 kB' 'MemAvailable: 47856836 kB' 'Buffers: 9788 kB' 'Cached: 10841224 kB' 'SwapCached: 0 kB' 'Active: 8255292 kB' 'Inactive: 3417780 kB' 'Active(anon): 7689372 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 825424 kB' 'Mapped: 144176 kB' 'Shmem: 6867312 kB' 'KReclaimable: 172780 kB' 'Slab: 411648 kB' 'SReclaimable: 172780 kB' 'SUnreclaim: 238868 kB' 'KernelStack: 15968 kB' 'PageTables: 8236 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963552 kB' 'Committed_AS: 9442436 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198752 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB'
[xtrace elided: setup/common.sh@31-32 reads each /proc/meminfo key/value pair in turn and continues until it reaches HugePages_Total]
00:03:29.055 10:21:50 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:29.055 10:21:50 -- setup/common.sh@33 -- # echo 1536
00:03:29.055 10:21:50 -- setup/common.sh@33 -- # return 0
00:03:29.055 10:21:50 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv ))
00:03:29.055 10:21:50 -- setup/hugepages.sh@112 -- # get_nodes
00:03:29.055 10:21:50 -- setup/hugepages.sh@27 -- # local node
00:03:29.055 10:21:50 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:29.055 10:21:50 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:29.055 10:21:50 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:29.055 10:21:50 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:29.055 10:21:50 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:29.055 10:21:50 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:29.055 10:21:50 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:29.055 10:21:50 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:29.055 10:21:50 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:29.055 10:21:50 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:29.055 10:21:50 -- setup/common.sh@18 -- # local node=0
00:03:29.055 10:21:50 -- setup/common.sh@19 -- # local var val
00:03:29.055 10:21:50 -- setup/common.sh@20 -- # local mem_f mem
00:03:29.055 10:21:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:29.055 10:21:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:29.055 10:21:50 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:29.055 10:21:50 -- setup/hugepages.sh@112 -- # get_nodes
00:03:29.055 10:21:50 -- setup/hugepages.sh@27 -- # local node
00:03:29.055 10:21:50 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:29.055 10:21:50 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:29.055 10:21:50 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:29.055 10:21:50 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:29.055 10:21:50 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:29.055 10:21:50 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:29.055 10:21:50 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:29.055 10:21:50 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:29.055 10:21:50 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:29.055 10:21:50 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:29.055 10:21:50 -- setup/common.sh@18 -- # local node=0
00:03:29.055 10:21:50 -- setup/common.sh@19 -- # local var val
00:03:29.055 10:21:50 -- setup/common.sh@20 -- # local mem_f mem
00:03:29.055 10:21:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:29.055 10:21:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:29.055 10:21:50 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:29.055 10:21:50 -- setup/common.sh@28 -- # mapfile -t mem
00:03:29.055 10:21:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:29.055 10:21:50 -- setup/common.sh@31 -- # IFS=': '
00:03:29.055 10:21:50 -- setup/common.sh@31 -- # read -r var val _
00:03:29.055 10:21:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634128 kB' 'MemFree: 21898400 kB' 'MemUsed: 10735728 kB' 'SwapCached: 0 kB' 'Active: 5718596 kB' 'Inactive: 3357500 kB' 'Active(anon): 5314592 kB' 'Inactive(anon): 0 kB' 'Active(file): 404004 kB' 'Inactive(file): 3357500 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8598304 kB' 'Mapped: 48664 kB' 'AnonPages: 481028 kB' 'Shmem: 4836800 kB' 'KernelStack: 9336 kB' 'PageTables: 4816 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 100220 kB' 'Slab: 230828 kB' 'SReclaimable: 100220 kB' 'SUnreclaim: 130608 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:29.055 10:21:50 -- setup/common.sh@31-32 -- # [scan condensed: [[ <var> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] -- # continue on mismatch, one pass per node0 key from MemTotal through HugePages_Free]
00:03:29.056 10:21:50 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:29.056 10:21:50 -- setup/common.sh@33 -- # echo 0
00:03:29.056 10:21:50 -- setup/common.sh@33 -- # return 0
00:03:29.056 10:21:50 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
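[editor's note] A word on the \H\u\g\e\P\a\g\e\s\_\S\u\r\p runs that dominate this trace: the right-hand side of == inside [[ ]] is a glob pattern, so the script quotes it to force a literal comparison, and bash -x prints a quoted pattern with every character backslash-escaped. A two-line demonstration (standalone snippet, not from the SPDK scripts):

  #!/usr/bin/env bash
  set -x
  var=HugePages_Surp get=HugePages_Surp
  [[ $var == "$get" ]]   # traced as: [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
  [[ $var == $get ]]     # unquoted RHS would be matched as a glob pattern instead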
00:03:29.056 10:21:50 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:29.056 10:21:50 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:29.056 10:21:50 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:29.056 10:21:50 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:29.056 10:21:50 -- setup/common.sh@18 -- # local node=1
00:03:29.056 10:21:50 -- setup/common.sh@19 -- # local var val
00:03:29.056 10:21:50 -- setup/common.sh@20 -- # local mem_f mem
00:03:29.056 10:21:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:29.056 10:21:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:29.056 10:21:50 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:29.056 10:21:50 -- setup/common.sh@28 -- # mapfile -t mem
00:03:29.056 10:21:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:29.056 10:21:50 -- setup/common.sh@31 -- # IFS=': '
00:03:29.056 10:21:50 -- setup/common.sh@31 -- # read -r var val _
00:03:29.057 10:21:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27661496 kB' 'MemFree: 22386980 kB' 'MemUsed: 5274516 kB' 'SwapCached: 0 kB' 'Active: 2536460 kB' 'Inactive: 60280 kB' 'Active(anon): 2374544 kB' 'Inactive(anon): 0 kB' 'Active(file): 161916 kB' 'Inactive(file): 60280 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2252752 kB' 'Mapped: 95512 kB' 'AnonPages: 344068 kB' 'Shmem: 2030556 kB' 'KernelStack: 6632 kB' 'PageTables: 3420 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 72560 kB' 'Slab: 180820 kB' 'SReclaimable: 72560 kB' 'SUnreclaim: 108260 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:03:29.057 10:21:50 -- setup/common.sh@31-32 -- # [scan condensed: [[ <var> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] -- # continue on mismatch, one pass per node1 key from MemTotal through HugePages_Free]
00:03:29.057 10:21:50 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:29.057 10:21:50 -- setup/common.sh@33 -- # echo 0
00:03:29.057 10:21:50 -- setup/common.sh@33 -- # return 0
00:03:29.057 10:21:50 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:29.057 10:21:50 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:29.057 10:21:50 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:29.057 10:21:50 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:29.057 10:21:50 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:03:29.057 node0=512 expecting 512
00:03:29.057 10:21:50 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:29.057 10:21:50 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:29.057 10:21:50 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:29.057 10:21:50 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:03:29.057 node1=1024 expecting 1024
00:03:29.057 10:21:50 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:03:29.057
00:03:29.057 real 0m3.745s
00:03:29.057 user 0m1.355s
00:03:29.057 sys 0m2.454s
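[editor's note] The bookkeeping that just printed 'node0=512 expecting 512' and 'node1=1024 expecting 1024' compares the per-node hugepage counts the kernel reports (nodes_sys) against what the test configured (nodes_test), after folding reserved and surplus pages into the expectation. A condensed sketch of that loop, with array names taken from the trace, the surrounding arithmetic assumed from context, and get_meminfo as sketched earlier:

  shopt -s extglob
  resv=0   # reserved pages, computed earlier in the verification

  declare -a nodes_test=([0]=512 [1]=1024)   # what custom_alloc asked for
  declare -a nodes_sys

  # What the kernel actually placed on each NUMA node.
  for node in /sys/devices/system/node/node+([0-9]); do
      nodes_sys[${node##*node}]=$(get_meminfo HugePages_Total "${node##*node}")
  done

  for node in "${!nodes_test[@]}"; do
      (( nodes_test[node] += resv ))                                   # @116 in the trace
      (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))  # @117, adds 0 here
      echo "node$node=${nodes_sys[node]} expecting ${nodes_test[node]}"
  done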
00:03:29.057 10:21:50 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:03:29.057 10:21:50 -- common/autotest_common.sh@10 -- # set +x
00:03:29.057 ************************************
00:03:29.057 END TEST custom_alloc
00:03:29.058 ************************************
00:03:29.058 10:21:50 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:03:29.058 10:21:50 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:29.058 10:21:50 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:29.058 10:21:50 -- common/autotest_common.sh@10 -- # set +x
00:03:29.058 ************************************
00:03:29.058 START TEST no_shrink_alloc
00:03:29.058 ************************************
00:03:29.058 10:21:51 -- common/autotest_common.sh@1111 -- # no_shrink_alloc
00:03:29.058 10:21:51 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:03:29.058 10:21:51 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:29.058 10:21:51 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:03:29.058 10:21:51 -- setup/hugepages.sh@51 -- # shift
00:03:29.058 10:21:51 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:03:29.058 10:21:51 -- setup/hugepages.sh@52 -- # local node_ids
00:03:29.058 10:21:51 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:29.058 10:21:51 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:29.058 10:21:51 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:03:29.058 10:21:51 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:03:29.058 10:21:51 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:29.058 10:21:51 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:29.058 10:21:51 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:29.058 10:21:51 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:29.058 10:21:51 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:29.058 10:21:51 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:03:29.058 10:21:51 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:29.058 10:21:51 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:03:29.058 10:21:51 -- setup/hugepages.sh@73 -- # return 0
00:03:29.058 10:21:51 -- setup/hugepages.sh@198 -- # setup output
00:03:29.058 10:21:51 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:29.058 10:21:51 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:32.362 0000:5e:00.0 (144d a80a): Already using the vfio-pci driver
00:03:32.362 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:32.362 0000:af:00.0 (8086 2701): Already using the vfio-pci driver
00:03:32.362 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:32.362 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:32.362 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:32.362 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:32.362 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:32.362 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:32.362 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:32.362 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:32.362 0000:b0:00.0 (8086 4140): Already using the vfio-pci driver
00:03:32.626 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:32.626 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:32.626 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:32.626 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
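[editor's note] no_shrink_alloc opened above by requesting 2 GiB of hugepages pinned to node 0: 2097152 kB divided by the 2048 kB default hugepage size yields the nr_hugepages=1024 seen at @57, and get_test_nr_hugepages_per_node then charges all 1024 pages to the single requested node. A sketch of that sizing logic, with the division step assumed (the trace only shows its result) and get_meminfo as sketched earlier:

  # get_test_nr_hugepages SIZE_KB [NODE...]: size the hugepage pool for a test
  # and record the per-node split in the global nodes_test array.
  get_test_nr_hugepages() {
      local size=$1
      (( $# > 1 )) && shift                        # trailing args are node ids
      local node_ids=("$@")
      local default_hugepages
      default_hugepages=$(get_meminfo Hugepagesize)    # 2048 (kB) on this host
      (( size >= default_hugepages )) || return 1
      nr_hugepages=$(( size / default_hugepages ))     # 2097152 / 2048 = 1024
      local -g nodes_test=()
      local node
      for node in "${node_ids[@]}"; do
          nodes_test[node]=$nr_hugepages               # all pages on node 0 here
      done
  }

  get_test_nr_hugepages 2097152 0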
00:03:32.626 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:32.626 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:32.626 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:32.626 10:21:54 -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:03:32.626 10:21:54 -- setup/hugepages.sh@89 -- # local node
00:03:32.626 10:21:54 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:32.626 10:21:54 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:32.626 10:21:54 -- setup/hugepages.sh@92 -- # local surp
00:03:32.626 10:21:54 -- setup/hugepages.sh@93 -- # local resv
00:03:32.626 10:21:54 -- setup/hugepages.sh@94 -- # local anon
00:03:32.626 10:21:54 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:32.626 10:21:54 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:32.626 10:21:54 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:32.626 10:21:54 -- setup/common.sh@18 -- # local node=
00:03:32.626 10:21:54 -- setup/common.sh@19 -- # local var val
00:03:32.626 10:21:54 -- setup/common.sh@20 -- # local mem_f mem
00:03:32.626 10:21:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:32.626 10:21:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:32.626 10:21:54 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:32.626 10:21:54 -- setup/common.sh@28 -- # mapfile -t mem
00:03:32.626 10:21:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:32.626 10:21:54 -- setup/common.sh@31 -- # IFS=': '
00:03:32.626 10:21:54 -- setup/common.sh@31 -- # read -r var val _
00:03:32.626 10:21:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45328924 kB' 'MemAvailable: 48901884 kB' 'Buffers: 9788 kB' 'Cached: 10841320 kB' 'SwapCached: 0 kB' 'Active: 8256024 kB' 'Inactive: 3417780 kB' 'Active(anon): 7690104 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 825460 kB' 'Mapped: 144276 kB' 'Shmem: 6867408 kB' 'KReclaimable: 172812 kB' 'Slab: 411720 kB' 'SReclaimable: 172812 kB' 'SUnreclaim: 238908 kB' 'KernelStack: 15984 kB' 'PageTables: 8192 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9442896 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198704 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB'
00:03:32.626 10:21:54 -- setup/common.sh@31-32 -- # [scan condensed: [[ <var> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] -- # continue on mismatch, one pass per key from MemTotal through HardwareCorrupted]
00:03:32.627 10:21:54 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:32.627 10:21:54 -- setup/common.sh@33 -- # echo 0
00:03:32.627 10:21:54 -- setup/common.sh@33 -- # return 0
00:03:32.627 10:21:54 -- setup/hugepages.sh@97 -- # anon=0
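[editor's note] verify_nr_hugepages only samples AnonHugePages when transparent hugepages could be in play: the @96 test above matched because this host reports 'always [madvise] never', i.e. THP is in madvise mode rather than disabled. Roughly, assuming the usual sysfs knob behind the string in the trace and the get_meminfo sketch from earlier:

  # Count anonymous THP usage unless THP is hard-disabled ("[never]").
  anon=0
  thp_mode=$(</sys/kernel/mm/transparent_hugepage/enabled)  # e.g. always [madvise] never
  if [[ $thp_mode != *"[never]"* ]]; then
      anon=$(get_meminfo AnonHugePages)   # 0 kB in this run
  fi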
00:03:32.627 10:21:54 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:32.627 10:21:54 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:32.627 10:21:54 -- setup/common.sh@18 -- # local node=
00:03:32.627 10:21:54 -- setup/common.sh@19 -- # local var val
00:03:32.627 10:21:54 -- setup/common.sh@20 -- # local mem_f mem
00:03:32.627 10:21:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:32.627 10:21:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:32.627 10:21:54 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:32.627 10:21:54 -- setup/common.sh@28 -- # mapfile -t mem
00:03:32.627 10:21:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:32.627 10:21:54 -- setup/common.sh@31 -- # IFS=': '
00:03:32.627 10:21:54 -- setup/common.sh@31 -- # read -r var val _
00:03:32.627 10:21:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45335068 kB' 'MemAvailable: 48908028 kB' 'Buffers: 9788 kB' 'Cached: 10841324 kB' 'SwapCached: 0 kB' 'Active: 8255316 kB' 'Inactive: 3417780 kB' 'Active(anon): 7689396 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 825224 kB' 'Mapped: 144200 kB' 'Shmem: 6867412 kB' 'KReclaimable: 172812 kB' 'Slab: 411688 kB' 'SReclaimable: 172812 kB' 'SUnreclaim: 238876 kB' 'KernelStack: 15984 kB' 'PageTables: 8188 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9442904 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198672 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB'
00:03:32.628 10:21:54 -- setup/common.sh@31-32 -- # [scan condensed: [[ <var> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] -- # continue on mismatch, one pass per key from MemTotal through HugePages_Rsvd]
00:03:32.629 10:21:54 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:32.629 10:21:54 -- setup/common.sh@33 -- # echo 0
00:03:32.629 10:21:54 -- setup/common.sh@33 -- # return 0
00:03:32.629 10:21:54 -- setup/hugepages.sh@99 -- # surp=0
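[editor's note] With anon and surp both 0, the verification needs one more counter before it can re-check the pool-size identity asserted earlier at hugepages.sh@110. The overall flow, condensed from this trace and hedged as a reconstruction (get_meminfo as sketched earlier; nr_hugepages is the global set when the test sized the pool):

  anon=$(get_meminfo AnonHugePages)    # 0 -- no THP inflating the numbers
  surp=$(get_meminfo HugePages_Surp)   # 0 -- nothing allocated beyond the pool
  resv=$(get_meminfo HugePages_Rsvd)   # read next in the trace
  # Same identity custom_alloc checked at @110: the pool the kernel reports
  # must equal what the test configured plus surplus and reserved pages.
  (( $(get_meminfo HugePages_Total) == nr_hugepages + surp + resv ))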
setup/common.sh@31 -- # IFS=': ' 00:03:32.629 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.629 10:21:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45335068 kB' 'MemAvailable: 48908028 kB' 'Buffers: 9788 kB' 'Cached: 10841324 kB' 'SwapCached: 0 kB' 'Active: 8255352 kB' 'Inactive: 3417780 kB' 'Active(anon): 7689432 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 825256 kB' 'Mapped: 144200 kB' 'Shmem: 6867412 kB' 'KReclaimable: 172812 kB' 'Slab: 411688 kB' 'SReclaimable: 172812 kB' 'SUnreclaim: 238876 kB' 'KernelStack: 16000 kB' 'PageTables: 8240 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9442920 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198672 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB' 00:03:32.629 10:21:54 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.629 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.629 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.629 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.629 10:21:54 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.629 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.629 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.629 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.629 10:21:54 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.629 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.629 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.629 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.629 10:21:54 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.629 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.629 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.629 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.629 10:21:54 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.629 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.629 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.629 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.629 10:21:54 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.629 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.629 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.629 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.629 10:21:54 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.629 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.629 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.629 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.629 10:21:54 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.629 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.892 
10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.892 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.892 10:21:54 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.892 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.892 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.892 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.892 10:21:54 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.892 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.892 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.892 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.892 10:21:54 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.892 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.892 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.892 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.892 10:21:54 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.892 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.892 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.892 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.892 10:21:54 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.892 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.892 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.892 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.892 10:21:54 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.892 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.892 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.892 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.892 10:21:54 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.892 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.892 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.892 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.892 10:21:54 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.892 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.892 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.892 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.892 10:21:54 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.892 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 
10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.893 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.893 10:21:54 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.893 10:21:54 -- setup/common.sh@33 -- # echo 0 00:03:32.893 10:21:54 -- setup/common.sh@33 -- # return 0 00:03:32.893 10:21:54 -- setup/hugepages.sh@100 -- # resv=0 00:03:32.893 10:21:54 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:32.893 nr_hugepages=1024 00:03:32.893 10:21:54 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:32.893 resv_hugepages=0 00:03:32.893 10:21:54 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:32.893 surplus_hugepages=0 00:03:32.893 10:21:54 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:32.893 anon_hugepages=0 00:03:32.893 10:21:54 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:32.893 10:21:54 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:32.893 10:21:54 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:32.893 10:21:54 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:32.893 10:21:54 -- setup/common.sh@18 -- # local node= 00:03:32.893 10:21:54 -- setup/common.sh@19 -- # local var val 00:03:32.894 10:21:54 -- setup/common.sh@20 -- # local mem_f mem 00:03:32.894 10:21:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:32.894 10:21:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:32.894 10:21:54 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:32.894 10:21:54 -- setup/common.sh@28 -- # mapfile -t mem 00:03:32.894 10:21:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:32.894 10:21:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45336504 kB' 'MemAvailable: 48909464 kB' 'Buffers: 9788 kB' 'Cached: 10841360 kB' 'SwapCached: 0 kB' 'Active: 8255044 kB' 'Inactive: 3417780 kB' 'Active(anon): 7689124 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 824828 kB' 'Mapped: 144200 kB' 'Shmem: 6867448 kB' 'KReclaimable: 172812 kB' 'Slab: 411688 kB' 'SReclaimable: 172812 kB' 'SUnreclaim: 238876 kB' 
'KernelStack: 15968 kB' 'PageTables: 8132 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9442936 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198688 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 
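[editor's note] The records above trace setup/common.sh's get_meminfo helper: it snapshots the chosen meminfo file with mapfile, strips any "Node N " prefix with the extglob expansion "${mem[@]#Node +([0-9]) }", then feeds the lines through printf '%s\n' into a read loop with IFS=': ', hitting `continue` on every key until the requested one matches and its value is echoed. A minimal standalone sketch of that pattern follows; the function name, structure, and the echo-0 fallback are illustrative reconstructions from this trace, not the script's exact code:

    # Hypothetical re-creation of the get_meminfo pattern traced above.
    meminfo_value() {                     # usage: meminfo_value Key [node]
        local get=$1 node=${2:-} mem_f=/proc/meminfo var val _
        shopt -s extglob
        # Per-node counters live in sysfs when a node index is supplied;
        # with node empty, .../node/node/meminfo fails the -e test, as in
        # the trace, and /proc/meminfo is used instead.
        [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        local -a mem
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")  # sysfs lines carry a "Node N " prefix
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue  # the long runs of 'continue' above
            echo "$val"                       # numeric value; the unit lands in _
            return 0
        done < <(printf '%s\n' "${mem[@]}")
        echo 0                                # key absent: report zero
    }

Called as `surp=$(meminfo_value HugePages_Surp)` or, per node, `meminfo_value HugePages_Surp 0`, mirroring the `get_meminfo HugePages_Surp 0` invocation seen later in this trace.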
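[editor's note] The values those lookups return feed the assertions in setup/hugepages.sh visible around this point: surp=0 (@99), resv=0 (@100), nr_hugepages=1024 (@102), and the pool checks at @107/@109/@110, followed by the per-node expectation echoed at @128. A worked restatement of that arithmetic, plugged with the values this run actually reported, is sketched below; it paraphrases the checks, it is not the script's code:

    # Hypothetical condensation of the assertions in this trace.
    nr_hugepages=1024   # hugepages.sh@102
    surp=0              # @99, from get_meminfo HugePages_Surp
    resv=0              # @100, from get_meminfo HugePages_Rsvd
    total=1024          # @110, from get_meminfo HugePages_Total
    # The allocated pool must be exactly the requested pages plus any
    # surplus and reserved pages; here 1024 == 1024 + 0 + 0 holds.
    (( total == nr_hugepages + surp + resv )) || echo "pool size mismatch"
    # Per-node split: node0 holds the whole pool, node1 none (no_nodes=2).
    nodes_sys=([0]=1024 [1]=0)
    echo "node0=${nodes_sys[0]} expecting $nr_hugepages"   # cf. @128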
00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 
10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.894 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.894 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # 
read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.895 
10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.895 10:21:54 -- setup/common.sh@33 -- # echo 1024 00:03:32.895 10:21:54 -- setup/common.sh@33 -- # return 0 00:03:32.895 10:21:54 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:32.895 10:21:54 -- setup/hugepages.sh@112 -- # get_nodes 00:03:32.895 10:21:54 -- setup/hugepages.sh@27 -- # local node 00:03:32.895 10:21:54 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:32.895 10:21:54 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:32.895 10:21:54 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:32.895 10:21:54 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:32.895 10:21:54 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:32.895 10:21:54 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:32.895 10:21:54 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:32.895 10:21:54 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:32.895 10:21:54 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:32.895 10:21:54 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:32.895 10:21:54 -- setup/common.sh@18 -- # local node=0 00:03:32.895 10:21:54 -- setup/common.sh@19 -- # local var val 00:03:32.895 10:21:54 -- setup/common.sh@20 -- # local mem_f mem 00:03:32.895 10:21:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:32.895 10:21:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:32.895 10:21:54 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:32.895 10:21:54 -- setup/common.sh@28 -- # mapfile -t mem 00:03:32.895 10:21:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634128 kB' 'MemFree: 20854468 kB' 'MemUsed: 11779660 kB' 'SwapCached: 0 kB' 'Active: 5722700 kB' 'Inactive: 3357500 kB' 'Active(anon): 5318696 kB' 'Inactive(anon): 0 kB' 'Active(file): 404004 kB' 'Inactive(file): 3357500 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8598328 kB' 'Mapped: 48688 kB' 'AnonPages: 484984 kB' 'Shmem: 4836824 kB' 'KernelStack: 9336 kB' 'PageTables: 4820 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 100188 kB' 'Slab: 230740 kB' 'SReclaimable: 100188 kB' 'SUnreclaim: 130552 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.895 10:21:54 -- setup/common.sh@32 -- # 
continue 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.895 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ 
SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # continue 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.896 10:21:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.896 10:21:54 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:32.896 10:21:54 -- setup/common.sh@33 -- # echo 0 00:03:32.896 10:21:54 -- setup/common.sh@33 -- # return 0 00:03:32.896 10:21:54 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:32.896 10:21:54 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:32.896 10:21:54 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:32.896 10:21:54 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:32.896 10:21:54 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:32.896 node0=1024 expecting 1024 00:03:32.896 10:21:54 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:32.896 10:21:54 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:03:32.896 10:21:54 -- setup/hugepages.sh@202 -- # NRHUGE=512 00:03:32.896 10:21:54 -- setup/hugepages.sh@202 -- # setup output 00:03:32.896 10:21:54 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:32.896 10:21:54 -- 
setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:36.200 0000:5e:00.0 (144d a80a): Already using the vfio-pci driver 00:03:36.200 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:36.200 0000:af:00.0 (8086 2701): Already using the vfio-pci driver 00:03:36.200 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:36.200 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:36.200 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:36.200 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:36.200 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:36.200 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:36.200 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:36.200 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:36.200 0000:b0:00.0 (8086 4140): Already using the vfio-pci driver 00:03:36.200 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:36.200 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:36.200 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:36.200 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:36.200 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:36.200 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:36.200 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:36.200 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:03:36.466 10:21:58 -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:03:36.466 10:21:58 -- setup/hugepages.sh@89 -- # local node 00:03:36.466 10:21:58 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:36.466 10:21:58 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:36.466 10:21:58 -- setup/hugepages.sh@92 -- # local surp 00:03:36.466 10:21:58 -- setup/hugepages.sh@93 -- # local resv 00:03:36.466 10:21:58 -- setup/hugepages.sh@94 -- # local anon 00:03:36.466 10:21:58 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:36.466 10:21:58 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:36.466 10:21:58 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:36.466 10:21:58 -- setup/common.sh@18 -- # local node= 00:03:36.466 10:21:58 -- setup/common.sh@19 -- # local var val 00:03:36.466 10:21:58 -- setup/common.sh@20 -- # local mem_f mem 00:03:36.466 10:21:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:36.466 10:21:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:36.466 10:21:58 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:36.466 10:21:58 -- setup/common.sh@28 -- # mapfile -t mem 00:03:36.466 10:21:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.466 10:21:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45340824 kB' 'MemAvailable: 48913784 kB' 'Buffers: 9788 kB' 'Cached: 10841412 kB' 'SwapCached: 0 kB' 'Active: 8255500 kB' 'Inactive: 3417780 kB' 'Active(anon): 7689580 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 825404 kB' 'Mapped: 144252 kB' 'Shmem: 6867500 kB' 'KReclaimable: 172812 kB' 'Slab: 411212 
kB' 'SReclaimable: 172812 kB' 'SUnreclaim: 238400 kB' 'KernelStack: 16048 kB' 'PageTables: 8340 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9443376 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198768 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB' 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.466 10:21:58 -- setup/common.sh@32 
-- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.466 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.466 10:21:58 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var 
val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.467 10:21:58 -- setup/common.sh@33 -- # echo 0 00:03:36.467 10:21:58 -- setup/common.sh@33 -- # return 0 00:03:36.467 10:21:58 -- setup/hugepages.sh@97 -- # anon=0 00:03:36.467 10:21:58 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:36.467 10:21:58 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:36.467 10:21:58 -- setup/common.sh@18 -- # local node= 00:03:36.467 10:21:58 -- setup/common.sh@19 -- # local var val 00:03:36.467 10:21:58 -- setup/common.sh@20 -- # local mem_f mem 00:03:36.467 10:21:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:36.467 10:21:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:36.467 10:21:58 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:36.467 10:21:58 -- setup/common.sh@28 -- # mapfile -t mem 00:03:36.467 10:21:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45351836 kB' 'MemAvailable: 48924796 kB' 'Buffers: 9788 kB' 'Cached: 10841416 kB' 'SwapCached: 0 kB' 'Active: 8255704 kB' 'Inactive: 3417780 kB' 'Active(anon): 7689784 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 825616 kB' 'Mapped: 144244 kB' 'Shmem: 6867504 kB' 'KReclaimable: 172812 kB' 'Slab: 411216 kB' 'SReclaimable: 172812 kB' 'SUnreclaim: 238404 kB' 'KernelStack: 16016 kB' 'PageTables: 8240 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9443388 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198752 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 
2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB' 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.467 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.467 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ Unevictable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 
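The wall of `continue` entries above is bash xtrace output from setup/common.sh's get_meminfo helper: it walks /proc/meminfo one entry at a time with `IFS=': ' read -r var val _`, skips every key that is not the one requested, and echoes the matching value (0 for AnonHugePages earlier, now scanning for HugePages_Surp). A minimal sketch of that pattern, assuming a (key, optional node) interface as the trace suggests — names here are illustrative, not the exact SPDK helper:

    #!/usr/bin/env bash
    # Sketch of the get_meminfo scan traced above (assumed interface).
    # get_mem KEY [NODE] prints KEY's value, preferring the per-node
    # meminfo file when NODE is given, mirroring the
    # [[ -e /sys/devices/system/node/node$node/meminfo ]] checks above.
    get_mem() {
        local get=$1 node=$2
        local mem_f=/proc/meminfo
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        local var val _
        # Per-node files prefix each line with "Node N "; strip it first,
        # as the ${mem[@]#Node +([0-9]) } expansion in the trace does.
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # the continues seen above
            echo "$val"                        # e.g. 0 for HugePages_Surp
            return 0
        done < <(sed -E 's/^Node [0-9]+ +//' "$mem_f")
        return 1
    }

Called as `get_mem HugePages_Surp`, this run would print 0, which is exactly what the `echo 0` / `return 0` pair in the trace shows.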
00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 
10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.468 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.468 10:21:58 -- setup/common.sh@32 -- 
# [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.468 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.469 10:21:58 -- setup/common.sh@33 -- # echo 0 00:03:36.469 10:21:58 -- setup/common.sh@33 -- # return 0 00:03:36.469 10:21:58 -- setup/hugepages.sh@99 -- # surp=0 00:03:36.469 10:21:58 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:36.469 10:21:58 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:36.469 10:21:58 -- setup/common.sh@18 -- # local node= 00:03:36.469 10:21:58 -- setup/common.sh@19 -- # local var val 00:03:36.469 10:21:58 -- setup/common.sh@20 -- # local mem_f mem 00:03:36.469 10:21:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:36.469 10:21:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:36.469 10:21:58 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:36.469 10:21:58 -- setup/common.sh@28 -- # mapfile -t mem 00:03:36.469 10:21:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45351080 kB' 'MemAvailable: 48924040 kB' 'Buffers: 9788 kB' 'Cached: 10841428 kB' 'SwapCached: 0 kB' 'Active: 8255728 kB' 'Inactive: 3417780 kB' 'Active(anon): 7689808 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 825608 kB' 'Mapped: 144244 kB' 'Shmem: 6867516 kB' 'KReclaimable: 172812 kB' 'Slab: 411216 kB' 'SReclaimable: 172812 kB' 'SUnreclaim: 238404 kB' 'KernelStack: 16016 kB' 'PageTables: 8240 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9443400 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198752 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB' 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- 
setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 
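A side note on the odd-looking right-hand sides such as \H\u\g\e\P\a\g\e\s\_\R\s\v\d: that is not corruption. When xtrace prints a [[ ]] comparison whose pattern side was quoted, bash backslash-escapes every character to show the match is literal, and this run's PS4 prepends a timestamp and script location instead of the default `+ `. A two-line demonstration:

    set -x
    get=HugePages_Rsvd var=Buffers
    [[ $var == "$get" ]] || echo "no match"
    # xtrace prints: [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]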
00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- 
setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.469 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.469 10:21:58 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 
00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.470 10:21:58 -- setup/common.sh@33 -- # echo 0 00:03:36.470 10:21:58 -- setup/common.sh@33 -- # return 0 00:03:36.470 10:21:58 -- setup/hugepages.sh@100 -- # resv=0 00:03:36.470 10:21:58 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:36.470 nr_hugepages=1024 00:03:36.470 10:21:58 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:36.470 resv_hugepages=0 00:03:36.470 10:21:58 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:36.470 surplus_hugepages=0 00:03:36.470 10:21:58 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:36.470 anon_hugepages=0 00:03:36.470 10:21:58 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:36.470 10:21:58 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:36.470 10:21:58 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:36.470 10:21:58 -- setup/common.sh@17 -- # 
local get=HugePages_Total 00:03:36.470 10:21:58 -- setup/common.sh@18 -- # local node= 00:03:36.470 10:21:58 -- setup/common.sh@19 -- # local var val 00:03:36.470 10:21:58 -- setup/common.sh@20 -- # local mem_f mem 00:03:36.470 10:21:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:36.470 10:21:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:36.470 10:21:58 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:36.470 10:21:58 -- setup/common.sh@28 -- # mapfile -t mem 00:03:36.470 10:21:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:36.470 10:21:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295624 kB' 'MemFree: 45351080 kB' 'MemAvailable: 48924040 kB' 'Buffers: 9788 kB' 'Cached: 10841428 kB' 'SwapCached: 0 kB' 'Active: 8255760 kB' 'Inactive: 3417780 kB' 'Active(anon): 7689840 kB' 'Inactive(anon): 0 kB' 'Active(file): 565920 kB' 'Inactive(file): 3417780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 825640 kB' 'Mapped: 144244 kB' 'Shmem: 6867516 kB' 'KReclaimable: 172812 kB' 'Slab: 411216 kB' 'SReclaimable: 172812 kB' 'SUnreclaim: 238404 kB' 'KernelStack: 16032 kB' 'PageTables: 8292 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487840 kB' 'Committed_AS: 9443048 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198752 kB' 'VmallocChunk: 0 kB' 'Percpu: 47680 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 441824 kB' 'DirectMap2M: 7622656 kB' 'DirectMap1G: 60817408 kB' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
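Once the scans return, hugepages.sh folds the results into its pass condition: the `anon=0`, `surp=0`, `resv=0`, and `nr_hugepages=1024` assignments visible in the trace feed the `(( 1024 == nr_hugepages + surp + resv ))` check at hugepages.sh line 107. A condensed restatement with this run's values (a sketch, not the script's exact code):

    # Values produced by the get_meminfo scans in this run.
    nr_hugepages=1024   # HugePages_Total
    surp=0              # HugePages_Surp
    resv=0              # HugePages_Rsvd
    anon=0              # AnonHugePages, in hugepage units

    # The allocation is consistent when the requested count equals the
    # kernel's total plus surplus and reserved pages (all zero here).
    (( 1024 == nr_hugepages + surp + resv )) &&
        (( 1024 == nr_hugepages )) &&
        echo "nr_hugepages=1024 resv_hugepages=0 surplus_hugepages=0"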
00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.470 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.470 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # 
read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # 
continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 
00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.471 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.471 10:21:58 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.472 10:21:58 -- setup/common.sh@33 -- # echo 1024 00:03:36.472 10:21:58 -- setup/common.sh@33 -- # return 0 00:03:36.472 10:21:58 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:36.472 10:21:58 -- setup/hugepages.sh@112 -- # get_nodes 00:03:36.472 10:21:58 -- setup/hugepages.sh@27 -- # local node 00:03:36.472 10:21:58 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:36.472 10:21:58 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:36.472 10:21:58 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:36.472 10:21:58 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:36.472 10:21:58 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:36.472 10:21:58 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:36.472 10:21:58 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:36.472 10:21:58 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:36.472 10:21:58 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:36.472 10:21:58 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:36.472 10:21:58 -- setup/common.sh@18 -- # local node=0 00:03:36.472 10:21:58 -- setup/common.sh@19 -- # local var val 00:03:36.472 10:21:58 -- setup/common.sh@20 -- # local mem_f mem 00:03:36.472 10:21:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:36.472 10:21:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:36.472 10:21:58 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:36.472 10:21:58 -- setup/common.sh@28 -- # mapfile -t mem 00:03:36.472 10:21:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:36.472 10:21:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634128 kB' 'MemFree: 20862308 kB' 'MemUsed: 11771820 kB' 'SwapCached: 0 kB' 'Active: 5725808 kB' 'Inactive: 3357500 kB' 'Active(anon): 5321804 kB' 'Inactive(anon): 0 kB' 'Active(file): 404004 kB' 
'Inactive(file): 3357500 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8598352 kB' 'Mapped: 48732 kB' 'AnonPages: 488232 kB' 'Shmem: 4836848 kB' 'KernelStack: 9320 kB' 'PageTables: 4760 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 100188 kB' 'Slab: 230516 kB' 'SReclaimable: 100188 kB' 'SUnreclaim: 130328 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 
10:21:58 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': 
' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.472 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.472 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.473 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.473 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.473 10:21:58 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.473 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.473 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.473 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.473 10:21:58 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.473 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.473 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.473 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.473 10:21:58 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.473 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.473 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.473 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.473 10:21:58 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.473 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.473 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.473 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.473 10:21:58 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.473 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.473 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.473 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.473 10:21:58 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.473 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.473 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.473 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.473 10:21:58 -- setup/common.sh@32 -- # [[ HugePages_Free == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.473 10:21:58 -- setup/common.sh@32 -- # continue 00:03:36.473 10:21:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.473 10:21:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.473 10:21:58 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.473 10:21:58 -- setup/common.sh@33 -- # echo 0 00:03:36.473 10:21:58 -- setup/common.sh@33 -- # return 0 00:03:36.473 10:21:58 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:36.473 10:21:58 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:36.473 10:21:58 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:36.473 10:21:58 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:36.473 10:21:58 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:36.473 node0=1024 expecting 1024 00:03:36.473 10:21:58 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:36.473 00:03:36.473 real 0m7.375s 00:03:36.473 user 0m2.763s 00:03:36.473 sys 0m4.726s 00:03:36.473 10:21:58 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:03:36.473 10:21:58 -- common/autotest_common.sh@10 -- # set +x 00:03:36.473 ************************************ 00:03:36.473 END TEST no_shrink_alloc 00:03:36.473 ************************************ 00:03:36.473 10:21:58 -- setup/hugepages.sh@217 -- # clear_hp 00:03:36.473 10:21:58 -- setup/hugepages.sh@37 -- # local node hp 00:03:36.473 10:21:58 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:36.473 10:21:58 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:36.473 10:21:58 -- setup/hugepages.sh@41 -- # echo 0 00:03:36.473 10:21:58 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:36.473 10:21:58 -- setup/hugepages.sh@41 -- # echo 0 00:03:36.473 10:21:58 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:36.473 10:21:58 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:36.473 10:21:58 -- setup/hugepages.sh@41 -- # echo 0 00:03:36.473 10:21:58 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:36.473 10:21:58 -- setup/hugepages.sh@41 -- # echo 0 00:03:36.473 10:21:58 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:36.473 10:21:58 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:36.473 00:03:36.473 real 0m27.592s 00:03:36.473 user 0m10.325s 00:03:36.473 sys 0m17.607s 00:03:36.473 10:21:58 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:03:36.473 10:21:58 -- common/autotest_common.sh@10 -- # set +x 00:03:36.473 ************************************ 00:03:36.473 END TEST hugepages 00:03:36.473 ************************************ 00:03:36.734 10:21:58 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:03:36.734 10:21:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:36.734 10:21:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:36.734 10:21:58 -- common/autotest_common.sh@10 -- # set +x 00:03:36.734 ************************************ 00:03:36.734 START TEST driver 00:03:36.734 ************************************ 00:03:36.734 10:21:58 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:03:36.734 * Looking for test storage... 
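[Editor's note] The long run of "continue" entries above is setup/common.sh scanning /proc/meminfo field by field until it reaches HugePages_Surp. A minimal sketch of that lookup pattern; the helper name is illustrative, not the repo's actual function, and the real script differs in detail:

get_meminfo_field() {   # illustrative name, not the repo's actual helper
    local target=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$target" ]] || continue   # the continue lines in the trace
        echo "${val:-0}"
        return 0
    done </proc/meminfo
    return 1
}
# e.g. get_meminfo_field HugePages_Surp   -> surplus hugepage count (0 above)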
00:03:36.734 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:36.734 10:21:58 -- setup/driver.sh@68 -- # setup reset 00:03:36.734 10:21:58 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:36.734 10:21:58 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:42.024 10:22:04 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:03:42.024 10:22:04 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:42.024 10:22:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:42.024 10:22:04 -- common/autotest_common.sh@10 -- # set +x 00:03:42.284 ************************************ 00:03:42.284 START TEST guess_driver 00:03:42.284 ************************************ 00:03:42.284 10:22:04 -- common/autotest_common.sh@1111 -- # guess_driver 00:03:42.284 10:22:04 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:03:42.284 10:22:04 -- setup/driver.sh@47 -- # local fail=0 00:03:42.284 10:22:04 -- setup/driver.sh@49 -- # pick_driver 00:03:42.284 10:22:04 -- setup/driver.sh@36 -- # vfio 00:03:42.284 10:22:04 -- setup/driver.sh@21 -- # local iommu_grups 00:03:42.284 10:22:04 -- setup/driver.sh@22 -- # local unsafe_vfio 00:03:42.284 10:22:04 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:03:42.284 10:22:04 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:03:42.284 10:22:04 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:03:42.284 10:22:04 -- setup/driver.sh@29 -- # (( 164 > 0 )) 00:03:42.284 10:22:04 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:03:42.284 10:22:04 -- setup/driver.sh@14 -- # mod vfio_pci 00:03:42.284 10:22:04 -- setup/driver.sh@12 -- # dep vfio_pci 00:03:42.284 10:22:04 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:03:42.284 10:22:04 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:03:42.284 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:42.284 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:42.284 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:42.284 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:42.284 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:03:42.284 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:03:42.284 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:03:42.284 10:22:04 -- setup/driver.sh@30 -- # return 0 00:03:42.284 10:22:04 -- setup/driver.sh@37 -- # echo vfio-pci 00:03:42.284 10:22:04 -- setup/driver.sh@49 -- # driver=vfio-pci 00:03:42.284 10:22:04 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:03:42.284 10:22:04 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:03:42.284 Looking for driver=vfio-pci 00:03:42.284 10:22:04 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:42.284 10:22:04 -- setup/driver.sh@45 -- # setup output config 00:03:42.284 10:22:04 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:42.284 10:22:04 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:45.585 10:22:07 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:45.585 10:22:07 -- setup/driver.sh@61 -- # [[ 
vfio-pci == vfio-pci ]] 00:03:45.585 10:22:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:45.585 10:22:07 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:45.585 10:22:07 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:45.585 10:22:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:45.585 10:22:07 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:45.585 10:22:07 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:45.585 10:22:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:45.585 10:22:07 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:45.845 10:22:07 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:45.846 10:22:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:45.846 10:22:07 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:45.846 10:22:07 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:45.846 10:22:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:45.846 10:22:07 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:45.846 10:22:07 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:45.846 10:22:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:45.846 10:22:07 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:45.846 10:22:07 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:45.846 10:22:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:45.846 10:22:07 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:45.846 10:22:07 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:45.846 10:22:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:45.846 10:22:07 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:45.846 10:22:07 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:45.846 10:22:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:45.846 10:22:07 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:45.846 10:22:07 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:45.846 10:22:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:45.846 10:22:07 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:45.846 10:22:07 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:45.846 10:22:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:45.846 10:22:07 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:45.846 10:22:07 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:45.846 10:22:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:45.846 10:22:07 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:45.846 10:22:07 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:45.846 10:22:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:45.846 10:22:07 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:45.846 10:22:07 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:45.846 10:22:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:45.846 10:22:07 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:45.846 10:22:07 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:45.846 10:22:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:45.846 10:22:07 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:45.846 10:22:07 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:45.846 10:22:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:45.846 10:22:07 -- 
setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:45.846 10:22:07 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:45.846 10:22:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:45.846 10:22:07 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:45.846 10:22:07 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:45.846 10:22:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:45.846 10:22:07 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:45.846 10:22:07 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:45.846 10:22:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:46.106 10:22:08 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:03:46.106 10:22:08 -- setup/driver.sh@65 -- # setup reset 00:03:46.106 10:22:08 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:46.106 10:22:08 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:51.393 00:03:51.393 real 0m9.061s 00:03:51.393 user 0m2.958s 00:03:51.393 sys 0m5.264s 00:03:51.393 10:22:13 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:03:51.393 10:22:13 -- common/autotest_common.sh@10 -- # set +x 00:03:51.393 ************************************ 00:03:51.393 END TEST guess_driver 00:03:51.393 ************************************ 00:03:51.393 00:03:51.393 real 0m14.521s 00:03:51.393 user 0m4.463s 00:03:51.393 sys 0m8.299s 00:03:51.393 10:22:13 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:03:51.393 10:22:13 -- common/autotest_common.sh@10 -- # set +x 00:03:51.393 ************************************ 00:03:51.393 END TEST driver 00:03:51.393 ************************************ 00:03:51.393 10:22:13 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:03:51.393 10:22:13 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:51.393 10:22:13 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:51.393 10:22:13 -- common/autotest_common.sh@10 -- # set +x 00:03:51.393 ************************************ 00:03:51.393 START TEST devices 00:03:51.393 ************************************ 00:03:51.393 10:22:13 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:03:51.654 * Looking for test storage... 
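[Editor's note] The guess_driver run above only settles on vfio-pci when the kernel has populated IOMMU groups (or unsafe no-IOMMU mode is enabled) and the module graph resolves. A condensed sketch under those assumptions; the function name is illustrative and the real flow in test/setup/driver.sh is more involved:

pick_driver() {   # illustrative condensation, not the script's exact flow
    local unsafe=N
    [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] &&
        unsafe=$(</sys/module/vfio/parameters/enable_unsafe_noiommu_mode)
    local groups=(/sys/kernel/iommu_groups/*)
    # 164 groups were found in the run above, so the vfio branch is taken
    if (( ${#groups[@]} > 0 )) || [[ $unsafe == Y ]]; then
        # --show-depends succeeds only if vfio_pci resolves to real .ko modules
        if modprobe --show-depends vfio_pci | grep -q '\.ko'; then
            echo vfio-pci
            return 0
        fi
    fi
    echo 'No valid driver found'
    return 1
}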
00:03:51.654 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:51.654 10:22:13 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:03:51.654 10:22:13 -- setup/devices.sh@192 -- # setup reset 00:03:51.654 10:22:13 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:51.654 10:22:13 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:55.864 10:22:17 -- setup/devices.sh@194 -- # get_zoned_devs 00:03:55.864 10:22:17 -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:03:55.864 10:22:17 -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:03:55.864 10:22:17 -- common/autotest_common.sh@1656 -- # local nvme bdf 00:03:55.864 10:22:17 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:03:55.864 10:22:17 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:03:55.864 10:22:17 -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:03:55.864 10:22:17 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:55.864 10:22:17 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:03:55.864 10:22:17 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:03:55.864 10:22:17 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme1n1 00:03:55.864 10:22:17 -- common/autotest_common.sh@1648 -- # local device=nvme1n1 00:03:55.864 10:22:17 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme1n1/queue/zoned ]] 00:03:55.864 10:22:17 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:03:55.864 10:22:17 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:03:55.864 10:22:17 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme2n1 00:03:55.864 10:22:17 -- common/autotest_common.sh@1648 -- # local device=nvme2n1 00:03:55.864 10:22:17 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme2n1/queue/zoned ]] 00:03:55.864 10:22:17 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:03:55.864 10:22:17 -- setup/devices.sh@196 -- # blocks=() 00:03:55.864 10:22:17 -- setup/devices.sh@196 -- # declare -a blocks 00:03:55.864 10:22:17 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:03:55.864 10:22:17 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:03:55.864 10:22:17 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:03:55.864 10:22:17 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:03:55.864 10:22:17 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:03:55.864 10:22:17 -- setup/devices.sh@201 -- # ctrl=nvme0 00:03:55.864 10:22:17 -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:03:55.864 10:22:17 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:03:55.864 10:22:17 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:03:55.864 10:22:17 -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:03:55.864 10:22:17 -- scripts/common.sh@387 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:03:55.864 No valid GPT data, bailing 00:03:55.864 10:22:17 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:55.864 10:22:17 -- scripts/common.sh@391 -- # pt= 00:03:55.864 10:22:17 -- scripts/common.sh@392 -- # return 1 00:03:55.864 10:22:17 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:03:55.864 10:22:17 -- setup/common.sh@76 -- # local dev=nvme0n1 00:03:55.864 10:22:17 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:03:55.864 10:22:17 -- setup/common.sh@80 -- # 
echo 1920383410176 00:03:55.864 10:22:17 -- setup/devices.sh@204 -- # (( 1920383410176 >= min_disk_size )) 00:03:55.864 10:22:17 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:03:55.864 10:22:17 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:03:55.864 10:22:17 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:03:55.864 10:22:17 -- setup/devices.sh@201 -- # ctrl=nvme1n1 00:03:55.864 10:22:17 -- setup/devices.sh@201 -- # ctrl=nvme1 00:03:55.864 10:22:17 -- setup/devices.sh@202 -- # pci=0000:af:00.0 00:03:55.864 10:22:17 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\a\f\:\0\0\.\0* ]] 00:03:55.864 10:22:17 -- setup/devices.sh@204 -- # block_in_use nvme1n1 00:03:55.864 10:22:17 -- scripts/common.sh@378 -- # local block=nvme1n1 pt 00:03:55.864 10:22:17 -- scripts/common.sh@387 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme1n1 00:03:55.864 No valid GPT data, bailing 00:03:55.864 10:22:17 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme1n1 00:03:55.864 10:22:17 -- scripts/common.sh@391 -- # pt= 00:03:55.864 10:22:17 -- scripts/common.sh@392 -- # return 1 00:03:55.864 10:22:17 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme1n1 00:03:55.864 10:22:17 -- setup/common.sh@76 -- # local dev=nvme1n1 00:03:55.864 10:22:17 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme1n1 ]] 00:03:55.864 10:22:17 -- setup/common.sh@80 -- # echo 375083606016 00:03:55.864 10:22:17 -- setup/devices.sh@204 -- # (( 375083606016 >= min_disk_size )) 00:03:55.864 10:22:17 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:03:55.864 10:22:17 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:af:00.0 00:03:55.864 10:22:17 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:03:55.864 10:22:17 -- setup/devices.sh@201 -- # ctrl=nvme2n1 00:03:55.864 10:22:17 -- setup/devices.sh@201 -- # ctrl=nvme2 00:03:55.864 10:22:17 -- setup/devices.sh@202 -- # pci=0000:b0:00.0 00:03:55.864 10:22:17 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\b\0\:\0\0\.\0* ]] 00:03:55.864 10:22:17 -- setup/devices.sh@204 -- # block_in_use nvme2n1 00:03:55.864 10:22:17 -- scripts/common.sh@378 -- # local block=nvme2n1 pt 00:03:55.864 10:22:17 -- scripts/common.sh@387 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme2n1 00:03:55.864 No valid GPT data, bailing 00:03:55.864 10:22:17 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme2n1 00:03:55.864 10:22:17 -- scripts/common.sh@391 -- # pt= 00:03:55.864 10:22:17 -- scripts/common.sh@392 -- # return 1 00:03:55.864 10:22:17 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme2n1 00:03:55.864 10:22:17 -- setup/common.sh@76 -- # local dev=nvme2n1 00:03:55.864 10:22:17 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme2n1 ]] 00:03:55.864 10:22:17 -- setup/common.sh@80 -- # echo 800166076416 00:03:55.864 10:22:17 -- setup/devices.sh@204 -- # (( 800166076416 >= min_disk_size )) 00:03:55.864 10:22:17 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:03:55.864 10:22:17 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:b0:00.0 00:03:55.864 10:22:17 -- setup/devices.sh@209 -- # (( 3 > 0 )) 00:03:55.864 10:22:17 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:03:55.864 10:22:17 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:03:55.864 10:22:17 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:55.864 10:22:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:55.864 
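[Editor's note] Each NVMe disk above passes three screens before joining the blocks array: not zoned, no existing partition table, and at least min_disk_size (3221225472 bytes, i.e. 3 GiB). A hedged equivalent; the real block_in_use check calls scripts/spdk-gpt.py, which is replaced here with blkid alone, and the helper name is illustrative:

usable_test_disk() {   # sketch; name and exact checks are assumptions
    local dev=$1 min_size=3221225472                   # 3 GiB, as in the trace
    # a non-"none" queue/zoned value marks a zoned device, which is skipped
    if [[ -e /sys/block/$dev/queue/zoned &&
          $(<"/sys/block/$dev/queue/zoned") != none ]]; then
        return 1
    fi
    # a device that already carries a partition table counts as in use
    [[ -z $(blkid -s PTTYPE -o value "/dev/$dev") ]] || return 1
    local size=$(( $(<"/sys/block/$dev/size") * 512 )) # sectors -> bytes
    (( size >= min_size ))
}
# e.g. usable_test_disk nvme0n1   (1920383410176 bytes above, so it passes)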
10:22:17 -- common/autotest_common.sh@10 -- # set +x 00:03:55.864 ************************************ 00:03:55.864 START TEST nvme_mount 00:03:55.864 ************************************ 00:03:55.864 10:22:17 -- common/autotest_common.sh@1111 -- # nvme_mount 00:03:55.864 10:22:17 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:03:55.864 10:22:17 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:03:55.864 10:22:17 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:55.864 10:22:17 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:55.864 10:22:17 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:03:55.864 10:22:17 -- setup/common.sh@39 -- # local disk=nvme0n1 00:03:55.864 10:22:17 -- setup/common.sh@40 -- # local part_no=1 00:03:55.864 10:22:17 -- setup/common.sh@41 -- # local size=1073741824 00:03:55.864 10:22:17 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:03:55.864 10:22:17 -- setup/common.sh@44 -- # parts=() 00:03:55.864 10:22:17 -- setup/common.sh@44 -- # local parts 00:03:55.864 10:22:17 -- setup/common.sh@46 -- # (( part = 1 )) 00:03:55.864 10:22:17 -- setup/common.sh@46 -- # (( part <= part_no )) 00:03:55.864 10:22:17 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:03:55.864 10:22:17 -- setup/common.sh@46 -- # (( part++ )) 00:03:55.864 10:22:17 -- setup/common.sh@46 -- # (( part <= part_no )) 00:03:55.864 10:22:17 -- setup/common.sh@51 -- # (( size /= 512 )) 00:03:55.864 10:22:17 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:03:55.864 10:22:17 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:03:57.251 Creating new GPT entries in memory. 00:03:57.251 GPT data structures destroyed! You may now partition the disk using fdisk or 00:03:57.251 other utilities. 00:03:57.251 10:22:18 -- setup/common.sh@57 -- # (( part = 1 )) 00:03:57.251 10:22:18 -- setup/common.sh@57 -- # (( part <= part_no )) 00:03:57.251 10:22:18 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:03:57.251 10:22:18 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:03:57.251 10:22:18 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:03:58.205 Creating new GPT entries in memory. 00:03:58.205 The operation has completed successfully. 
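[Editor's note] The partition step traced below reduces to two sgdisk calls serialized with flock, plus a udev wait between them. A minimal reproduction; partprobe stands in for the repo's sync_dev_uevents.sh helper:

disk=/dev/nvme0n1
flock "$disk" sgdisk "$disk" --zap-all            # wipe any GPT/MBR remnants
# 2048..2099199 is exactly 2097152 sectors = 1 GiB at 512 B/sector
flock "$disk" sgdisk "$disk" --new=1:2048:2099199
partprobe "$disk"                                 # stand-in for sync_dev_uevents.sh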
00:03:58.205 10:22:20 -- setup/common.sh@57 -- # (( part++ )) 00:03:58.205 10:22:20 -- setup/common.sh@57 -- # (( part <= part_no )) 00:03:58.206 10:22:20 -- setup/common.sh@62 -- # wait 168853 00:03:58.206 10:22:20 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:58.206 10:22:20 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:03:58.206 10:22:20 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:58.206 10:22:20 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:03:58.206 10:22:20 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:03:58.206 10:22:20 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:58.206 10:22:20 -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:58.206 10:22:20 -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:03:58.206 10:22:20 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:03:58.206 10:22:20 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:58.206 10:22:20 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:58.206 10:22:20 -- setup/devices.sh@53 -- # local found=0 00:03:58.206 10:22:20 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:03:58.206 10:22:20 -- setup/devices.sh@56 -- # : 00:03:58.206 10:22:20 -- setup/devices.sh@59 -- # local pci status 00:03:58.206 10:22:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:58.206 10:22:20 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:03:58.206 10:22:20 -- setup/devices.sh@47 -- # setup output config 00:03:58.206 10:22:20 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:58.206 10:22:20 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:01.508 10:22:23 -- setup/devices.sh@63 -- # found=1 00:04:01.508 10:22:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ 0000:af:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:01.508 10:22:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:01.508 10:22:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:01.508 10:22:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:01.508 10:22:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 
]] 00:04:01.508 10:22:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:01.508 10:22:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:01.508 10:22:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:01.508 10:22:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:01.508 10:22:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ 0000:b0:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:01.508 10:22:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:01.508 10:22:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:01.508 10:22:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:01.508 10:22:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:01.508 10:22:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:01.508 10:22:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:01.508 10:22:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:01.508 10:22:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.508 10:22:23 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:01.508 10:22:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.769 10:22:23 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:01.769 10:22:23 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:01.769 10:22:23 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:01.769 10:22:23 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:01.769 10:22:23 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:01.769 10:22:23 -- setup/devices.sh@110 -- # cleanup_nvme 00:04:01.769 10:22:23 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:01.769 10:22:23 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:01.769 10:22:23 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:01.769 10:22:23 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:01.769 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:01.769 10:22:23 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 
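[Editor's note] The teardown above (cleanup_nvme) is unmount-then-wipe, partition first and whole disk second, which is why the trace shows wipefs erasing the ext4 signature before the GPT headers. A hedged sketch; MOUNT_DIR is an illustrative variable, not the script's:

MOUNT_DIR=/path/to/nvme_mount      # illustrative; the test uses its repo path
mountpoint -q "$MOUNT_DIR" && umount "$MOUNT_DIR"
[[ -b /dev/nvme0n1p1 ]] && wipefs --all /dev/nvme0n1p1   # ext4 signature
[[ -b /dev/nvme0n1   ]] && wipefs --all /dev/nvme0n1     # then GPT + PMBR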
00:04:01.769 10:22:23 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:02.030 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:02.030 /dev/nvme0n1: 8 bytes were erased at offset 0x1bf1fc55e00 (gpt): 45 46 49 20 50 41 52 54 00:04:02.030 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:02.030 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:02.030 10:22:23 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:02.030 10:22:23 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:02.030 10:22:23 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:02.030 10:22:23 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:02.030 10:22:23 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:02.030 10:22:23 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:02.030 10:22:24 -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:02.030 10:22:24 -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:02.030 10:22:24 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:02.030 10:22:24 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:02.030 10:22:24 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:02.030 10:22:24 -- setup/devices.sh@53 -- # local found=0 00:04:02.030 10:22:24 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:02.030 10:22:24 -- setup/devices.sh@56 -- # : 00:04:02.030 10:22:24 -- setup/devices.sh@59 -- # local pci status 00:04:02.030 10:22:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:02.030 10:22:24 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:02.030 10:22:24 -- setup/devices.sh@47 -- # setup output config 00:04:02.030 10:22:24 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:02.030 10:22:24 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:05.500 10:22:27 -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:05.500 10:22:27 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:05.500 10:22:27 -- setup/devices.sh@63 -- # found=1 00:04:05.500 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.500 10:22:27 -- setup/devices.sh@62 -- # [[ 0000:af:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:05.500 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.500 10:22:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:05.500 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.500 10:22:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:05.500 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.500 10:22:27 -- 
setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:05.500 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.500 10:22:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:05.500 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.500 10:22:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:05.500 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.500 10:22:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:05.500 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.500 10:22:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:05.500 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.500 10:22:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:05.500 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.500 10:22:27 -- setup/devices.sh@62 -- # [[ 0000:b0:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:05.500 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.500 10:22:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:05.500 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.500 10:22:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:05.500 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.500 10:22:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:05.500 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.501 10:22:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:05.501 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.501 10:22:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:05.501 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.501 10:22:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:05.501 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.501 10:22:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:05.501 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.501 10:22:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:05.501 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.501 10:22:27 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:05.501 10:22:27 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:05.501 10:22:27 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:05.501 10:22:27 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:05.501 10:22:27 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:05.501 10:22:27 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:05.761 10:22:27 -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:04:05.761 10:22:27 -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:05.761 10:22:27 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:05.761 10:22:27 -- 
setup/devices.sh@50 -- # local mount_point= 00:04:05.761 10:22:27 -- setup/devices.sh@51 -- # local test_file= 00:04:05.761 10:22:27 -- setup/devices.sh@53 -- # local found=0 00:04:05.761 10:22:27 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:05.761 10:22:27 -- setup/devices.sh@59 -- # local pci status 00:04:05.761 10:22:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:05.761 10:22:27 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:05.761 10:22:27 -- setup/devices.sh@47 -- # setup output config 00:04:05.761 10:22:27 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:05.761 10:22:27 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:09.064 10:22:30 -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:09.064 10:22:30 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:09.064 10:22:30 -- setup/devices.sh@63 -- # found=1 00:04:09.064 10:22:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.064 10:22:30 -- setup/devices.sh@62 -- # [[ 0000:af:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:09.064 10:22:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.064 10:22:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:09.064 10:22:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.064 10:22:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:09.064 10:22:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.064 10:22:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:09.064 10:22:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.064 10:22:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:09.064 10:22:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.064 10:22:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:09.064 10:22:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.064 10:22:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:09.064 10:22:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.064 10:22:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:09.064 10:22:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.064 10:22:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:09.064 10:22:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.064 10:22:30 -- setup/devices.sh@62 -- # [[ 0000:b0:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:09.064 10:22:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.064 10:22:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:09.064 10:22:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.064 10:22:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:09.064 10:22:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.064 10:22:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:09.064 10:22:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.064 10:22:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:09.064 10:22:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.064 10:22:31 -- setup/devices.sh@62 -- # [[ 
0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:09.064 10:22:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.064 10:22:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:09.064 10:22:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.064 10:22:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:09.064 10:22:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.064 10:22:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:09.064 10:22:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.325 10:22:31 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:09.325 10:22:31 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:09.325 10:22:31 -- setup/devices.sh@68 -- # return 0 00:04:09.325 10:22:31 -- setup/devices.sh@128 -- # cleanup_nvme 00:04:09.325 10:22:31 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:09.325 10:22:31 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:09.325 10:22:31 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:09.325 10:22:31 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:09.325 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:09.325 00:04:09.325 real 0m13.276s 00:04:09.325 user 0m4.032s 00:04:09.325 sys 0m7.222s 00:04:09.325 10:22:31 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:09.325 10:22:31 -- common/autotest_common.sh@10 -- # set +x 00:04:09.325 ************************************ 00:04:09.325 END TEST nvme_mount 00:04:09.325 ************************************ 00:04:09.325 10:22:31 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:09.325 10:22:31 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:09.325 10:22:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:09.325 10:22:31 -- common/autotest_common.sh@10 -- # set +x 00:04:09.325 ************************************ 00:04:09.325 START TEST dm_mount 00:04:09.325 ************************************ 00:04:09.325 10:22:31 -- common/autotest_common.sh@1111 -- # dm_mount 00:04:09.325 10:22:31 -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:09.325 10:22:31 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:09.325 10:22:31 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:09.325 10:22:31 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:09.325 10:22:31 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:09.325 10:22:31 -- setup/common.sh@40 -- # local part_no=2 00:04:09.325 10:22:31 -- setup/common.sh@41 -- # local size=1073741824 00:04:09.325 10:22:31 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:09.325 10:22:31 -- setup/common.sh@44 -- # parts=() 00:04:09.325 10:22:31 -- setup/common.sh@44 -- # local parts 00:04:09.325 10:22:31 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:09.325 10:22:31 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:09.325 10:22:31 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:09.325 10:22:31 -- setup/common.sh@46 -- # (( part++ )) 00:04:09.325 10:22:31 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:09.325 10:22:31 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:09.325 10:22:31 -- setup/common.sh@46 -- # (( part++ )) 00:04:09.325 10:22:31 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:09.325 10:22:31 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:09.325 10:22:31 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 
--zap-all 00:04:09.325 10:22:31 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:10.711 Creating new GPT entries in memory. 00:04:10.711 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:10.711 other utilities. 00:04:10.711 10:22:32 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:10.711 10:22:32 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:10.711 10:22:32 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:10.711 10:22:32 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:10.711 10:22:32 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:11.652 Creating new GPT entries in memory. 00:04:11.652 The operation has completed successfully. 00:04:11.652 10:22:33 -- setup/common.sh@57 -- # (( part++ )) 00:04:11.652 10:22:33 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:11.652 10:22:33 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:11.652 10:22:33 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:11.652 10:22:33 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:12.596 The operation has completed successfully. 00:04:12.596 10:22:34 -- setup/common.sh@57 -- # (( part++ )) 00:04:12.596 10:22:34 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:12.596 10:22:34 -- setup/common.sh@62 -- # wait 172994 00:04:12.596 10:22:34 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:12.596 10:22:34 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:12.596 10:22:34 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:12.596 10:22:34 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:12.596 10:22:34 -- setup/devices.sh@160 -- # for t in {1..5} 00:04:12.596 10:22:34 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:12.596 10:22:34 -- setup/devices.sh@161 -- # break 00:04:12.596 10:22:34 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:12.596 10:22:34 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:12.596 10:22:34 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:12.596 10:22:34 -- setup/devices.sh@166 -- # dm=dm-0 00:04:12.596 10:22:34 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:12.596 10:22:34 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:12.596 10:22:34 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:12.596 10:22:34 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:04:12.596 10:22:34 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:12.596 10:22:34 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:12.596 10:22:34 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:12.596 10:22:34 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:12.596 10:22:34 -- setup/devices.sh@174 -- # verify 
0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:12.596 10:22:34 -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:12.596 10:22:34 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:12.596 10:22:34 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:12.596 10:22:34 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:12.596 10:22:34 -- setup/devices.sh@53 -- # local found=0 00:04:12.596 10:22:34 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:12.596 10:22:34 -- setup/devices.sh@56 -- # : 00:04:12.596 10:22:34 -- setup/devices.sh@59 -- # local pci status 00:04:12.596 10:22:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:12.596 10:22:34 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:12.596 10:22:34 -- setup/devices.sh@47 -- # setup output config 00:04:12.596 10:22:34 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:12.596 10:22:34 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:15.900 10:22:37 -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:15.900 10:22:37 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:15.900 10:22:37 -- setup/devices.sh@63 -- # found=1 00:04:15.900 10:22:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.900 10:22:37 -- setup/devices.sh@62 -- # [[ 0000:af:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:15.900 10:22:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.900 10:22:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:15.900 10:22:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.900 10:22:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:15.900 10:22:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.900 10:22:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:15.900 10:22:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.900 10:22:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:15.900 10:22:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.900 10:22:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:15.900 10:22:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.900 10:22:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:15.900 10:22:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.900 10:22:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:15.900 10:22:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.900 10:22:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:15.900 10:22:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.900 10:22:37 -- setup/devices.sh@62 -- # [[ 0000:b0:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:15.900 10:22:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.900 
10:22:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:15.900 10:22:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.900 10:22:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:15.900 10:22:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.900 10:22:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:15.900 10:22:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.900 10:22:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:15.900 10:22:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.900 10:22:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:15.900 10:22:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.900 10:22:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:15.900 10:22:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.900 10:22:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:15.901 10:22:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.901 10:22:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:15.901 10:22:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:16.162 10:22:38 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:16.162 10:22:38 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:16.162 10:22:38 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:16.162 10:22:38 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:16.162 10:22:38 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:16.162 10:22:38 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:16.162 10:22:38 -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:16.162 10:22:38 -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:16.162 10:22:38 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:16.162 10:22:38 -- setup/devices.sh@50 -- # local mount_point= 00:04:16.162 10:22:38 -- setup/devices.sh@51 -- # local test_file= 00:04:16.162 10:22:38 -- setup/devices.sh@53 -- # local found=0 00:04:16.162 10:22:38 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:16.162 10:22:38 -- setup/devices.sh@59 -- # local pci status 00:04:16.162 10:22:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:16.162 10:22:38 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:16.162 10:22:38 -- setup/devices.sh@47 -- # setup output config 00:04:16.162 10:22:38 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:16.162 10:22:38 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:19.554 10:22:41 -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:19.554 10:22:41 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:04:19.554 10:22:41 -- setup/devices.sh@63 -- # found=1 
00:04:19.554 10:22:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.554 10:22:41 -- setup/devices.sh@62 -- # [[ 0000:af:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:19.554 10:22:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.554 10:22:41 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:19.554 10:22:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.554 10:22:41 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:19.554 10:22:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.554 10:22:41 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:19.554 10:22:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.554 10:22:41 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:19.554 10:22:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.554 10:22:41 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:19.554 10:22:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.554 10:22:41 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:19.554 10:22:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.554 10:22:41 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:19.554 10:22:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.554 10:22:41 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:19.554 10:22:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.554 10:22:41 -- setup/devices.sh@62 -- # [[ 0000:b0:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:19.554 10:22:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.554 10:22:41 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:19.554 10:22:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.555 10:22:41 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:19.555 10:22:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.555 10:22:41 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:19.555 10:22:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.555 10:22:41 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:19.555 10:22:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.555 10:22:41 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:19.555 10:22:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.555 10:22:41 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:19.555 10:22:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.555 10:22:41 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:19.555 10:22:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.555 10:22:41 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:19.555 10:22:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.819 10:22:41 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:19.819 10:22:41 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:19.819 10:22:41 -- setup/devices.sh@68 -- # return 0 00:04:19.819 10:22:41 -- setup/devices.sh@187 -- # cleanup_dm 00:04:19.819 10:22:41 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:19.819 10:22:41 -- 
setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]]
00:04:19.819 10:22:41 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test
00:04:19.819 10:22:41 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]]
00:04:19.819 10:22:41 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1
00:04:19.819 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef
00:04:19.819 10:22:41 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]]
00:04:19.819 10:22:41 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2
00:04:19.819
00:04:19.819 real 0m10.457s
00:04:19.819 user 0m2.739s
00:04:19.819 sys 0m4.815s
00:04:19.819 10:22:41 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:04:19.819 10:22:41 -- common/autotest_common.sh@10 -- # set +x
00:04:19.819 ************************************
00:04:19.819 END TEST dm_mount
00:04:19.819 ************************************
00:04:19.819 10:22:41 -- setup/devices.sh@1 -- # cleanup
00:04:19.819 10:22:41 -- setup/devices.sh@11 -- # cleanup_nvme
00:04:19.819 10:22:41 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:19.819 10:22:41 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]]
00:04:19.819 10:22:41 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1
00:04:19.819 10:22:41 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]]
00:04:19.819 10:22:41 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1
00:04:20.107 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54
00:04:20.107 /dev/nvme0n1: 8 bytes were erased at offset 0x1bf1fc55e00 (gpt): 45 46 49 20 50 41 52 54
00:04:20.107 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa
00:04:20.107 /dev/nvme0n1: calling ioctl to re-read partition table: Success
00:04:20.107 10:22:42 -- setup/devices.sh@12 -- # cleanup_dm
00:04:20.107 10:22:42 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount
00:04:20.107 10:22:42 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]]
00:04:20.107 10:22:42 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]]
00:04:20.107 10:22:42 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]]
00:04:20.107 10:22:42 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]]
00:04:20.107 10:22:42 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1
00:04:20.107
00:04:20.107 real 0m28.781s
00:04:20.107 user 0m8.547s
00:04:20.107 sys 0m15.205s
00:04:20.107 10:22:42 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:04:20.107 10:22:42 -- common/autotest_common.sh@10 -- # set +x
00:04:20.107 ************************************
00:04:20.107 END TEST devices
00:04:20.107 ************************************
00:04:20.383
00:04:20.383 real 1m38.700s
00:04:20.383 user 0m32.364s
00:04:20.383 sys 0m57.806s
00:04:20.383 10:22:42 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:04:20.383 10:22:42 -- common/autotest_common.sh@10 -- # set +x
00:04:20.383 ************************************
00:04:20.383 END TEST setup.sh
00:04:20.383 ************************************
00:04:20.384 10:22:42 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:04:23.847 Hugepages
00:04:23.847 node hugesize free / total
00:04:23.847 node0 1048576kB 0 / 0
00:04:23.847 node0 2048kB 2048 / 2048
00:04:23.847 node1 1048576kB 0 / 0
00:04:23.847 node1 2048kB 0 / 0
00:04:23.847
00:04:23.847 Type BDF Vendor Device NUMA Driver Device Block devices
00:04:23.847 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:04:23.847 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:04:23.847 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:04:23.847 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:04:23.847 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:04:23.847 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:04:23.847 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:04:23.847 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:04:23.847 NVMe 0000:5e:00.0 144d a80a 0 nvme nvme0 nvme0n1
00:04:23.847 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:04:23.847 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:04:23.847 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:04:23.847 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:04:24.106 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:04:24.106 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:04:24.106 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:04:24.106 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:04:24.106 NVMe 0000:af:00.0 8086 2701 1 nvme nvme1 nvme1n1
00:04:24.106 NVMe 0000:b0:00.0 8086 4140 1 nvme nvme2 nvme2n1
00:04:24.106 10:22:46 -- spdk/autotest.sh@130 -- # uname -s
00:04:24.106 10:22:46 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]]
00:04:24.106 10:22:46 -- spdk/autotest.sh@132 -- # nvme_namespace_revert
00:04:24.106 10:22:46 -- common/autotest_common.sh@1517 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:28.302 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:28.302 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:28.302 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:28.302 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:28.302 0000:af:00.0 (8086 2701): nvme -> vfio-pci
00:04:28.302 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci
00:04:28.302 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci
00:04:28.302 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci
00:04:28.302 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci
00:04:28.302 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:28.302 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:28.302 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:28.302 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:28.302 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci
00:04:28.302 0000:b0:00.0 (8086 4140): nvme -> vfio-pci
00:04:28.302 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci
00:04:28.302 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci
00:04:28.302 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci
00:04:29.681 0000:5e:00.0 (144d a80a): nvme -> vfio-pci
00:04:29.681 10:22:51 -- common/autotest_common.sh@1518 -- # sleep 1
00:04:30.619 10:22:52 -- common/autotest_common.sh@1519 -- # bdfs=()
00:04:30.619 10:22:52 -- common/autotest_common.sh@1519 -- # local bdfs
00:04:30.619 10:22:52 -- common/autotest_common.sh@1520 -- # bdfs=($(get_nvme_bdfs))
00:04:30.619 10:22:52 -- common/autotest_common.sh@1520 -- # get_nvme_bdfs
00:04:30.619 10:22:52 -- common/autotest_common.sh@1499 -- # bdfs=()
00:04:30.619 10:22:52 -- common/autotest_common.sh@1499 -- # local bdfs
00:04:30.619 10:22:52 -- common/autotest_common.sh@1500 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'))
00:04:30.619 10:22:52 -- common/autotest_common.sh@1500 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh
00:04:30.619 10:22:52 -- common/autotest_common.sh@1500 -- # jq -r '.config[].params.traddr'
00:04:30.878 10:22:52 -- common/autotest_common.sh@1501 -- # (( 3 == 0 ))
00:04:30.878 10:22:52 -- common/autotest_common.sh@1505 -- # printf '%s\n'
0000:5e:00.0 0000:af:00.0 0000:b0:00.0 00:04:30.878 10:22:52 -- common/autotest_common.sh@1522 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:34.170 Waiting for block devices as requested 00:04:34.170 0000:5e:00.0 (144d a80a): vfio-pci -> nvme 00:04:34.430 0000:af:00.0 (8086 2701): vfio-pci -> nvme 00:04:34.430 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:34.689 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:34.690 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:34.690 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:34.949 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:34.949 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:34.949 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:35.209 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:35.209 0000:b0:00.0 (8086 4140): vfio-pci -> nvme 00:04:35.209 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:35.469 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:35.469 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:35.469 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:35.728 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:35.728 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:35.728 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:35.988 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:35.988 10:22:57 -- common/autotest_common.sh@1524 -- # for bdf in "${bdfs[@]}" 00:04:35.988 10:22:57 -- common/autotest_common.sh@1525 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:04:35.988 10:22:57 -- common/autotest_common.sh@1488 -- # readlink -f /sys/class/nvme/nvme0 /sys/class/nvme/nvme1 /sys/class/nvme/nvme2 00:04:35.988 10:22:57 -- common/autotest_common.sh@1488 -- # grep 0000:5e:00.0/nvme/nvme 00:04:35.988 10:22:57 -- common/autotest_common.sh@1488 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:04:35.988 10:22:57 -- common/autotest_common.sh@1489 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:04:35.988 10:22:57 -- common/autotest_common.sh@1493 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:04:35.988 10:22:57 -- common/autotest_common.sh@1493 -- # printf '%s\n' nvme0 00:04:35.988 10:22:57 -- common/autotest_common.sh@1525 -- # nvme_ctrlr=/dev/nvme0 00:04:35.988 10:22:57 -- common/autotest_common.sh@1526 -- # [[ -z /dev/nvme0 ]] 00:04:35.988 10:22:57 -- common/autotest_common.sh@1531 -- # nvme id-ctrl /dev/nvme0 00:04:35.988 10:22:57 -- common/autotest_common.sh@1531 -- # grep oacs 00:04:35.988 10:22:57 -- common/autotest_common.sh@1531 -- # cut -d: -f2 00:04:35.988 10:22:57 -- common/autotest_common.sh@1531 -- # oacs=' 0x5f' 00:04:35.988 10:22:57 -- common/autotest_common.sh@1532 -- # oacs_ns_manage=8 00:04:35.988 10:22:57 -- common/autotest_common.sh@1534 -- # [[ 8 -ne 0 ]] 00:04:35.988 10:22:57 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:04:35.988 10:22:57 -- common/autotest_common.sh@1540 -- # grep unvmcap 00:04:35.988 10:22:57 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:04:35.988 10:22:58 -- common/autotest_common.sh@1540 -- # unvmcap=' 0' 00:04:35.988 10:22:58 -- common/autotest_common.sh@1541 -- # [[ 0 -eq 0 ]] 00:04:35.988 10:22:58 -- common/autotest_common.sh@1543 -- # continue 00:04:35.988 10:22:58 -- common/autotest_common.sh@1524 -- # for bdf in "${bdfs[@]}" 00:04:35.988 10:22:58 -- common/autotest_common.sh@1525 -- # get_nvme_ctrlr_from_bdf 0000:af:00.0 00:04:35.988 10:22:58 -- common/autotest_common.sh@1488 -- # readlink 
-f /sys/class/nvme/nvme0 /sys/class/nvme/nvme1 /sys/class/nvme/nvme2 00:04:35.988 10:22:58 -- common/autotest_common.sh@1488 -- # grep 0000:af:00.0/nvme/nvme 00:04:35.988 10:22:58 -- common/autotest_common.sh@1488 -- # bdf_sysfs_path=/sys/devices/pci0000:ae/0000:ae:00.0/0000:af:00.0/nvme/nvme1 00:04:35.988 10:22:58 -- common/autotest_common.sh@1489 -- # [[ -z /sys/devices/pci0000:ae/0000:ae:00.0/0000:af:00.0/nvme/nvme1 ]] 00:04:35.988 10:22:58 -- common/autotest_common.sh@1493 -- # basename /sys/devices/pci0000:ae/0000:ae:00.0/0000:af:00.0/nvme/nvme1 00:04:35.988 10:22:58 -- common/autotest_common.sh@1493 -- # printf '%s\n' nvme1 00:04:35.988 10:22:58 -- common/autotest_common.sh@1525 -- # nvme_ctrlr=/dev/nvme1 00:04:35.988 10:22:58 -- common/autotest_common.sh@1526 -- # [[ -z /dev/nvme1 ]] 00:04:35.988 10:22:58 -- common/autotest_common.sh@1531 -- # nvme id-ctrl /dev/nvme1 00:04:35.988 10:22:58 -- common/autotest_common.sh@1531 -- # grep oacs 00:04:35.988 10:22:58 -- common/autotest_common.sh@1531 -- # cut -d: -f2 00:04:35.988 10:22:58 -- common/autotest_common.sh@1531 -- # oacs=' 0x7' 00:04:35.988 10:22:58 -- common/autotest_common.sh@1532 -- # oacs_ns_manage=0 00:04:35.988 10:22:58 -- common/autotest_common.sh@1534 -- # [[ 0 -ne 0 ]] 00:04:35.988 10:22:58 -- common/autotest_common.sh@1524 -- # for bdf in "${bdfs[@]}" 00:04:35.988 10:22:58 -- common/autotest_common.sh@1525 -- # get_nvme_ctrlr_from_bdf 0000:b0:00.0 00:04:35.988 10:22:58 -- common/autotest_common.sh@1488 -- # readlink -f /sys/class/nvme/nvme0 /sys/class/nvme/nvme1 /sys/class/nvme/nvme2 00:04:35.988 10:22:58 -- common/autotest_common.sh@1488 -- # grep 0000:b0:00.0/nvme/nvme 00:04:35.988 10:22:58 -- common/autotest_common.sh@1488 -- # bdf_sysfs_path=/sys/devices/pci0000:ae/0000:ae:02.0/0000:b0:00.0/nvme/nvme2 00:04:35.989 10:22:58 -- common/autotest_common.sh@1489 -- # [[ -z /sys/devices/pci0000:ae/0000:ae:02.0/0000:b0:00.0/nvme/nvme2 ]] 00:04:35.989 10:22:58 -- common/autotest_common.sh@1493 -- # basename /sys/devices/pci0000:ae/0000:ae:02.0/0000:b0:00.0/nvme/nvme2 00:04:35.989 10:22:58 -- common/autotest_common.sh@1493 -- # printf '%s\n' nvme2 00:04:35.989 10:22:58 -- common/autotest_common.sh@1525 -- # nvme_ctrlr=/dev/nvme2 00:04:35.989 10:22:58 -- common/autotest_common.sh@1526 -- # [[ -z /dev/nvme2 ]] 00:04:35.989 10:22:58 -- common/autotest_common.sh@1531 -- # nvme id-ctrl /dev/nvme2 00:04:35.989 10:22:58 -- common/autotest_common.sh@1531 -- # grep oacs 00:04:35.989 10:22:58 -- common/autotest_common.sh@1531 -- # cut -d: -f2 00:04:35.989 10:22:58 -- common/autotest_common.sh@1531 -- # oacs=' 0x1e' 00:04:35.989 10:22:58 -- common/autotest_common.sh@1532 -- # oacs_ns_manage=8 00:04:35.989 10:22:58 -- common/autotest_common.sh@1534 -- # [[ 8 -ne 0 ]] 00:04:35.989 10:22:58 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme2 00:04:35.989 10:22:58 -- common/autotest_common.sh@1540 -- # grep unvmcap 00:04:35.989 10:22:58 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:04:35.989 10:22:58 -- common/autotest_common.sh@1540 -- # unvmcap=' 0' 00:04:35.989 10:22:58 -- common/autotest_common.sh@1541 -- # [[ 0 -eq 0 ]] 00:04:35.989 10:22:58 -- common/autotest_common.sh@1543 -- # continue 00:04:35.989 10:22:58 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:04:35.989 10:22:58 -- common/autotest_common.sh@716 -- # xtrace_disable 00:04:35.989 10:22:58 -- common/autotest_common.sh@10 -- # set +x 00:04:36.248 10:22:58 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:04:36.248 10:22:58 -- 
common/autotest_common.sh@710 -- # xtrace_disable 00:04:36.248 10:22:58 -- common/autotest_common.sh@10 -- # set +x 00:04:36.248 10:22:58 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:39.541 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:39.541 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:39.800 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:39.800 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:39.800 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:39.800 0000:af:00.0 (8086 2701): nvme -> vfio-pci 00:04:39.800 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:39.800 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:39.800 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:39.800 0000:5e:00.0 (144d a80a): nvme -> vfio-pci 00:04:39.800 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:39.800 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:39.800 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:39.800 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:39.800 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:39.800 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:39.800 0000:b0:00.0 (8086 4140): nvme -> vfio-pci 00:04:39.800 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:39.800 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:40.060 10:23:01 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:04:40.060 10:23:01 -- common/autotest_common.sh@716 -- # xtrace_disable 00:04:40.060 10:23:01 -- common/autotest_common.sh@10 -- # set +x 00:04:40.060 10:23:02 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:04:40.060 10:23:02 -- common/autotest_common.sh@1577 -- # mapfile -t bdfs 00:04:40.060 10:23:02 -- common/autotest_common.sh@1577 -- # get_nvme_bdfs_by_id 0x0a54 00:04:40.060 10:23:02 -- common/autotest_common.sh@1563 -- # bdfs=() 00:04:40.060 10:23:02 -- common/autotest_common.sh@1563 -- # local bdfs 00:04:40.060 10:23:02 -- common/autotest_common.sh@1565 -- # get_nvme_bdfs 00:04:40.060 10:23:02 -- common/autotest_common.sh@1499 -- # bdfs=() 00:04:40.060 10:23:02 -- common/autotest_common.sh@1499 -- # local bdfs 00:04:40.060 10:23:02 -- common/autotest_common.sh@1500 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:40.060 10:23:02 -- common/autotest_common.sh@1500 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:40.060 10:23:02 -- common/autotest_common.sh@1500 -- # jq -r '.config[].params.traddr' 00:04:40.320 10:23:02 -- common/autotest_common.sh@1501 -- # (( 3 == 0 )) 00:04:40.320 10:23:02 -- common/autotest_common.sh@1505 -- # printf '%s\n' 0000:5e:00.0 0000:af:00.0 0000:b0:00.0 00:04:40.320 10:23:02 -- common/autotest_common.sh@1565 -- # for bdf in $(get_nvme_bdfs) 00:04:40.320 10:23:02 -- common/autotest_common.sh@1566 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:04:40.320 10:23:02 -- common/autotest_common.sh@1566 -- # device=0xa80a 00:04:40.320 10:23:02 -- common/autotest_common.sh@1567 -- # [[ 0xa80a == \0\x\0\a\5\4 ]] 00:04:40.320 10:23:02 -- common/autotest_common.sh@1565 -- # for bdf in $(get_nvme_bdfs) 00:04:40.320 10:23:02 -- common/autotest_common.sh@1566 -- # cat /sys/bus/pci/devices/0000:af:00.0/device 00:04:40.320 10:23:02 -- common/autotest_common.sh@1566 -- # device=0x2701 00:04:40.320 10:23:02 -- common/autotest_common.sh@1567 -- # [[ 0x2701 == \0\x\0\a\5\4 ]] 00:04:40.320 10:23:02 -- common/autotest_common.sh@1565 -- # for bdf in $(get_nvme_bdfs) 00:04:40.320 10:23:02 -- 
common/autotest_common.sh@1566 -- # cat /sys/bus/pci/devices/0000:b0:00.0/device 00:04:40.320 10:23:02 -- common/autotest_common.sh@1566 -- # device=0x4140 00:04:40.320 10:23:02 -- common/autotest_common.sh@1567 -- # [[ 0x4140 == \0\x\0\a\5\4 ]] 00:04:40.320 10:23:02 -- common/autotest_common.sh@1572 -- # printf '%s\n' 00:04:40.320 10:23:02 -- common/autotest_common.sh@1578 -- # [[ -z '' ]] 00:04:40.320 10:23:02 -- common/autotest_common.sh@1579 -- # return 0 00:04:40.320 10:23:02 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:04:40.320 10:23:02 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:04:40.320 10:23:02 -- spdk/autotest.sh@155 -- # [[ 0 -eq 1 ]] 00:04:40.320 10:23:02 -- spdk/autotest.sh@155 -- # [[ 0 -eq 1 ]] 00:04:40.320 10:23:02 -- spdk/autotest.sh@162 -- # timing_enter lib 00:04:40.320 10:23:02 -- common/autotest_common.sh@710 -- # xtrace_disable 00:04:40.320 10:23:02 -- common/autotest_common.sh@10 -- # set +x 00:04:40.320 10:23:02 -- spdk/autotest.sh@164 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:04:40.320 10:23:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:40.320 10:23:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:40.320 10:23:02 -- common/autotest_common.sh@10 -- # set +x 00:04:40.320 ************************************ 00:04:40.320 START TEST env 00:04:40.320 ************************************ 00:04:40.320 10:23:02 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:04:40.580 * Looking for test storage... 00:04:40.580 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:04:40.580 10:23:02 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:04:40.580 10:23:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:40.580 10:23:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:40.580 10:23:02 -- common/autotest_common.sh@10 -- # set +x 00:04:40.580 ************************************ 00:04:40.580 START TEST env_memory 00:04:40.580 ************************************ 00:04:40.580 10:23:02 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:04:40.580 00:04:40.580 00:04:40.580 CUnit - A unit testing framework for C - Version 2.1-3 00:04:40.580 http://cunit.sourceforge.net/ 00:04:40.580 00:04:40.580 00:04:40.580 Suite: memory 00:04:40.580 Test: alloc and free memory map ...[2024-04-19 10:23:02.599943] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:04:40.580 passed 00:04:40.580 Test: mem map translation ...[2024-04-19 10:23:02.612961] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:04:40.580 [2024-04-19 10:23:02.612978] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:04:40.580 [2024-04-19 10:23:02.613006] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:04:40.580 [2024-04-19 10:23:02.613014] 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:04:40.580 passed 00:04:40.580 Test: mem map registration ...[2024-04-19 10:23:02.634641] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:04:40.580 [2024-04-19 10:23:02.634658] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:04:40.580 passed 00:04:40.580 Test: mem map adjacent registrations ...passed 00:04:40.580 00:04:40.580 Run Summary: Type Total Ran Passed Failed Inactive 00:04:40.580 suites 1 1 n/a 0 0 00:04:40.580 tests 4 4 4 0 0 00:04:40.580 asserts 152 152 152 0 n/a 00:04:40.580 00:04:40.580 Elapsed time = 0.076 seconds 00:04:40.580 00:04:40.580 real 0m0.089s 00:04:40.580 user 0m0.076s 00:04:40.580 sys 0m0.012s 00:04:40.580 10:23:02 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:40.580 10:23:02 -- common/autotest_common.sh@10 -- # set +x 00:04:40.580 ************************************ 00:04:40.580 END TEST env_memory 00:04:40.580 ************************************ 00:04:40.841 10:23:02 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:04:40.841 10:23:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:40.841 10:23:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:40.841 10:23:02 -- common/autotest_common.sh@10 -- # set +x 00:04:40.841 ************************************ 00:04:40.841 START TEST env_vtophys 00:04:40.841 ************************************ 00:04:40.841 10:23:02 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:04:40.841 EAL: lib.eal log level changed from notice to debug 00:04:40.841 EAL: Detected lcore 0 as core 0 on socket 0 00:04:40.841 EAL: Detected lcore 1 as core 1 on socket 0 00:04:40.841 EAL: Detected lcore 2 as core 2 on socket 0 00:04:40.841 EAL: Detected lcore 3 as core 3 on socket 0 00:04:40.841 EAL: Detected lcore 4 as core 4 on socket 0 00:04:40.841 EAL: Detected lcore 5 as core 8 on socket 0 00:04:40.841 EAL: Detected lcore 6 as core 9 on socket 0 00:04:40.841 EAL: Detected lcore 7 as core 10 on socket 0 00:04:40.841 EAL: Detected lcore 8 as core 11 on socket 0 00:04:40.841 EAL: Detected lcore 9 as core 16 on socket 0 00:04:40.841 EAL: Detected lcore 10 as core 17 on socket 0 00:04:40.841 EAL: Detected lcore 11 as core 18 on socket 0 00:04:40.841 EAL: Detected lcore 12 as core 19 on socket 0 00:04:40.841 EAL: Detected lcore 13 as core 20 on socket 0 00:04:40.841 EAL: Detected lcore 14 as core 24 on socket 0 00:04:40.841 EAL: Detected lcore 15 as core 25 on socket 0 00:04:40.841 EAL: Detected lcore 16 as core 26 on socket 0 00:04:40.841 EAL: Detected lcore 17 as core 27 on socket 0 00:04:40.841 EAL: Detected lcore 18 as core 0 on socket 1 00:04:40.841 EAL: Detected lcore 19 as core 1 on socket 1 00:04:40.841 EAL: Detected lcore 20 as core 2 on socket 1 00:04:40.841 EAL: Detected lcore 21 as core 3 on socket 1 00:04:40.841 EAL: Detected lcore 22 as core 4 on socket 1 00:04:40.841 EAL: Detected lcore 23 as core 8 on socket 1 00:04:40.841 EAL: Detected lcore 24 as core 9 on socket 1 00:04:40.841 EAL: Detected lcore 25 as core 10 on socket 1 00:04:40.841 EAL: Detected lcore 26 as 
core 11 on socket 1 00:04:40.841 EAL: Detected lcore 27 as core 16 on socket 1 00:04:40.841 EAL: Detected lcore 28 as core 17 on socket 1 00:04:40.841 EAL: Detected lcore 29 as core 18 on socket 1 00:04:40.841 EAL: Detected lcore 30 as core 19 on socket 1 00:04:40.841 EAL: Detected lcore 31 as core 20 on socket 1 00:04:40.841 EAL: Detected lcore 32 as core 24 on socket 1 00:04:40.841 EAL: Detected lcore 33 as core 25 on socket 1 00:04:40.841 EAL: Detected lcore 34 as core 26 on socket 1 00:04:40.841 EAL: Detected lcore 35 as core 27 on socket 1 00:04:40.841 EAL: Detected lcore 36 as core 0 on socket 0 00:04:40.841 EAL: Detected lcore 37 as core 1 on socket 0 00:04:40.841 EAL: Detected lcore 38 as core 2 on socket 0 00:04:40.841 EAL: Detected lcore 39 as core 3 on socket 0 00:04:40.841 EAL: Detected lcore 40 as core 4 on socket 0 00:04:40.841 EAL: Detected lcore 41 as core 8 on socket 0 00:04:40.841 EAL: Detected lcore 42 as core 9 on socket 0 00:04:40.841 EAL: Detected lcore 43 as core 10 on socket 0 00:04:40.841 EAL: Detected lcore 44 as core 11 on socket 0 00:04:40.841 EAL: Detected lcore 45 as core 16 on socket 0 00:04:40.841 EAL: Detected lcore 46 as core 17 on socket 0 00:04:40.841 EAL: Detected lcore 47 as core 18 on socket 0 00:04:40.841 EAL: Detected lcore 48 as core 19 on socket 0 00:04:40.841 EAL: Detected lcore 49 as core 20 on socket 0 00:04:40.841 EAL: Detected lcore 50 as core 24 on socket 0 00:04:40.841 EAL: Detected lcore 51 as core 25 on socket 0 00:04:40.841 EAL: Detected lcore 52 as core 26 on socket 0 00:04:40.841 EAL: Detected lcore 53 as core 27 on socket 0 00:04:40.841 EAL: Detected lcore 54 as core 0 on socket 1 00:04:40.841 EAL: Detected lcore 55 as core 1 on socket 1 00:04:40.841 EAL: Detected lcore 56 as core 2 on socket 1 00:04:40.841 EAL: Detected lcore 57 as core 3 on socket 1 00:04:40.841 EAL: Detected lcore 58 as core 4 on socket 1 00:04:40.841 EAL: Detected lcore 59 as core 8 on socket 1 00:04:40.841 EAL: Detected lcore 60 as core 9 on socket 1 00:04:40.841 EAL: Detected lcore 61 as core 10 on socket 1 00:04:40.841 EAL: Detected lcore 62 as core 11 on socket 1 00:04:40.841 EAL: Detected lcore 63 as core 16 on socket 1 00:04:40.841 EAL: Detected lcore 64 as core 17 on socket 1 00:04:40.841 EAL: Detected lcore 65 as core 18 on socket 1 00:04:40.841 EAL: Detected lcore 66 as core 19 on socket 1 00:04:40.841 EAL: Detected lcore 67 as core 20 on socket 1 00:04:40.841 EAL: Detected lcore 68 as core 24 on socket 1 00:04:40.841 EAL: Detected lcore 69 as core 25 on socket 1 00:04:40.841 EAL: Detected lcore 70 as core 26 on socket 1 00:04:40.841 EAL: Detected lcore 71 as core 27 on socket 1 00:04:40.841 EAL: Maximum logical cores by configuration: 128 00:04:40.841 EAL: Detected CPU lcores: 72 00:04:40.841 EAL: Detected NUMA nodes: 2 00:04:40.841 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:04:40.841 EAL: Checking presence of .so 'librte_eal.so.24' 00:04:40.841 EAL: Checking presence of .so 'librte_eal.so' 00:04:40.841 EAL: Detected static linkage of DPDK 00:04:40.842 EAL: No shared files mode enabled, IPC will be disabled 00:04:40.842 EAL: Bus pci wants IOVA as 'DC' 00:04:40.842 EAL: Buses did not request a specific IOVA mode. 00:04:40.842 EAL: IOMMU is available, selecting IOVA as VA mode. 00:04:40.842 EAL: Selected IOVA mode 'VA' 00:04:40.842 EAL: No free 2048 kB hugepages reported on node 1 00:04:40.842 EAL: Probing VFIO support... 
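EAL can select IOVA-as-VA below only because the earlier `setup.sh` passes bound the NVMe and I/OAT functions to `vfio-pci` on a host with a working IOMMU; the VFIO probe lines that follow confirm it. The same state can be read back out of sysfs (generic checks, not commands the test scripts themselves run):

    #!/usr/bin/env bash
    # Non-empty only when the kernel IOMMU (e.g. intel_iommu=on) is active.
    ls /sys/kernel/iommu_groups | wc -l
    # A device rebound by setup.sh should now report vfio-pci as its driver.
    basename "$(readlink /sys/bus/pci/devices/0000:5e:00.0/driver)"    # expect: vfio-pci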
00:04:40.842 EAL: IOMMU type 1 (Type 1) is supported 00:04:40.842 EAL: IOMMU type 7 (sPAPR) is not supported 00:04:40.842 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:04:40.842 EAL: VFIO support initialized 00:04:40.842 EAL: Ask a virtual area of 0x2e000 bytes 00:04:40.842 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:04:40.842 EAL: Setting up physically contiguous memory... 00:04:40.842 EAL: Setting maximum number of open files to 524288 00:04:40.842 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:04:40.842 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:04:40.842 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:04:40.842 EAL: Ask a virtual area of 0x61000 bytes 00:04:40.842 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:04:40.842 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:40.842 EAL: Ask a virtual area of 0x400000000 bytes 00:04:40.842 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:04:40.842 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:04:40.842 EAL: Ask a virtual area of 0x61000 bytes 00:04:40.842 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:04:40.842 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:40.842 EAL: Ask a virtual area of 0x400000000 bytes 00:04:40.842 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:04:40.842 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:04:40.842 EAL: Ask a virtual area of 0x61000 bytes 00:04:40.842 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:04:40.842 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:40.842 EAL: Ask a virtual area of 0x400000000 bytes 00:04:40.842 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:04:40.842 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:04:40.842 EAL: Ask a virtual area of 0x61000 bytes 00:04:40.842 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:04:40.842 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:40.842 EAL: Ask a virtual area of 0x400000000 bytes 00:04:40.842 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:04:40.842 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:04:40.842 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:04:40.842 EAL: Ask a virtual area of 0x61000 bytes 00:04:40.842 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:04:40.842 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:40.842 EAL: Ask a virtual area of 0x400000000 bytes 00:04:40.842 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:04:40.842 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:04:40.842 EAL: Ask a virtual area of 0x61000 bytes 00:04:40.842 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:04:40.842 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:40.842 EAL: Ask a virtual area of 0x400000000 bytes 00:04:40.842 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:04:40.842 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:04:40.842 EAL: Ask a virtual area of 0x61000 bytes 00:04:40.842 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:04:40.842 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:40.842 EAL: Ask a virtual area of 0x400000000 bytes 00:04:40.842 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:04:40.842 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:04:40.842 EAL: Ask a virtual area of 0x61000 bytes 00:04:40.842 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:04:40.842 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:40.842 EAL: Ask a virtual area of 0x400000000 bytes 00:04:40.842 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:04:40.842 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:04:40.842 EAL: Hugepages will be freed exactly as allocated. 00:04:40.842 EAL: No shared files mode enabled, IPC is disabled 00:04:40.842 EAL: No shared files mode enabled, IPC is disabled 00:04:40.842 EAL: TSC frequency is ~2300000 KHz 00:04:40.842 EAL: Main lcore 0 is ready (tid=7f26a1f3ba00;cpuset=[0]) 00:04:40.842 EAL: Trying to obtain current memory policy. 00:04:40.842 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:40.842 EAL: Restoring previous memory policy: 0 00:04:40.842 EAL: request: mp_malloc_sync 00:04:40.842 EAL: No shared files mode enabled, IPC is disabled 00:04:40.842 EAL: Heap on socket 0 was expanded by 2MB 00:04:40.842 EAL: No shared files mode enabled, IPC is disabled 00:04:40.842 EAL: Mem event callback 'spdk:(nil)' registered 00:04:40.842 00:04:40.842 00:04:40.842 CUnit - A unit testing framework for C - Version 2.1-3 00:04:40.842 http://cunit.sourceforge.net/ 00:04:40.842 00:04:40.842 00:04:40.842 Suite: components_suite 00:04:40.842 Test: vtophys_malloc_test ...passed 00:04:40.842 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:04:40.842 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:40.842 EAL: Restoring previous memory policy: 4 00:04:40.842 EAL: Calling mem event callback 'spdk:(nil)' 00:04:40.842 EAL: request: mp_malloc_sync 00:04:40.842 EAL: No shared files mode enabled, IPC is disabled 00:04:40.842 EAL: Heap on socket 0 was expanded by 4MB 00:04:40.842 EAL: Calling mem event callback 'spdk:(nil)' 00:04:40.842 EAL: request: mp_malloc_sync 00:04:40.842 EAL: No shared files mode enabled, IPC is disabled 00:04:40.842 EAL: Heap on socket 0 was shrunk by 4MB 00:04:40.842 EAL: Trying to obtain current memory policy. 00:04:40.842 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:40.842 EAL: Restoring previous memory policy: 4 00:04:40.842 EAL: Calling mem event callback 'spdk:(nil)' 00:04:40.842 EAL: request: mp_malloc_sync 00:04:40.842 EAL: No shared files mode enabled, IPC is disabled 00:04:40.842 EAL: Heap on socket 0 was expanded by 6MB 00:04:40.842 EAL: Calling mem event callback 'spdk:(nil)' 00:04:40.842 EAL: request: mp_malloc_sync 00:04:40.842 EAL: No shared files mode enabled, IPC is disabled 00:04:40.842 EAL: Heap on socket 0 was shrunk by 6MB 00:04:40.842 EAL: Trying to obtain current memory policy. 00:04:40.842 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:40.842 EAL: Restoring previous memory policy: 4 00:04:40.842 EAL: Calling mem event callback 'spdk:(nil)' 00:04:40.842 EAL: request: mp_malloc_sync 00:04:40.842 EAL: No shared files mode enabled, IPC is disabled 00:04:40.842 EAL: Heap on socket 0 was expanded by 10MB 00:04:40.842 EAL: Calling mem event callback 'spdk:(nil)' 00:04:40.842 EAL: request: mp_malloc_sync 00:04:40.842 EAL: No shared files mode enabled, IPC is disabled 00:04:40.842 EAL: Heap on socket 0 was shrunk by 10MB 00:04:40.842 EAL: Trying to obtain current memory policy. 
00:04:40.842 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:40.842 EAL: Restoring previous memory policy: 4 00:04:40.842 EAL: Calling mem event callback 'spdk:(nil)' 00:04:40.842 EAL: request: mp_malloc_sync 00:04:40.842 EAL: No shared files mode enabled, IPC is disabled 00:04:40.842 EAL: Heap on socket 0 was expanded by 18MB 00:04:40.842 EAL: Calling mem event callback 'spdk:(nil)' 00:04:40.842 EAL: request: mp_malloc_sync 00:04:40.842 EAL: No shared files mode enabled, IPC is disabled 00:04:40.842 EAL: Heap on socket 0 was shrunk by 18MB 00:04:40.842 EAL: Trying to obtain current memory policy. 00:04:40.842 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:40.842 EAL: Restoring previous memory policy: 4 00:04:40.842 EAL: Calling mem event callback 'spdk:(nil)' 00:04:40.842 EAL: request: mp_malloc_sync 00:04:40.842 EAL: No shared files mode enabled, IPC is disabled 00:04:40.842 EAL: Heap on socket 0 was expanded by 34MB 00:04:40.842 EAL: Calling mem event callback 'spdk:(nil)' 00:04:41.102 EAL: request: mp_malloc_sync 00:04:41.102 EAL: No shared files mode enabled, IPC is disabled 00:04:41.102 EAL: Heap on socket 0 was shrunk by 34MB 00:04:41.102 EAL: Trying to obtain current memory policy. 00:04:41.102 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:41.102 EAL: Restoring previous memory policy: 4 00:04:41.102 EAL: Calling mem event callback 'spdk:(nil)' 00:04:41.102 EAL: request: mp_malloc_sync 00:04:41.102 EAL: No shared files mode enabled, IPC is disabled 00:04:41.102 EAL: Heap on socket 0 was expanded by 66MB 00:04:41.102 EAL: Calling mem event callback 'spdk:(nil)' 00:04:41.102 EAL: request: mp_malloc_sync 00:04:41.102 EAL: No shared files mode enabled, IPC is disabled 00:04:41.102 EAL: Heap on socket 0 was shrunk by 66MB 00:04:41.102 EAL: Trying to obtain current memory policy. 00:04:41.102 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:41.102 EAL: Restoring previous memory policy: 4 00:04:41.102 EAL: Calling mem event callback 'spdk:(nil)' 00:04:41.102 EAL: request: mp_malloc_sync 00:04:41.102 EAL: No shared files mode enabled, IPC is disabled 00:04:41.102 EAL: Heap on socket 0 was expanded by 130MB 00:04:41.102 EAL: Calling mem event callback 'spdk:(nil)' 00:04:41.102 EAL: request: mp_malloc_sync 00:04:41.102 EAL: No shared files mode enabled, IPC is disabled 00:04:41.102 EAL: Heap on socket 0 was shrunk by 130MB 00:04:41.103 EAL: Trying to obtain current memory policy. 00:04:41.103 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:41.103 EAL: Restoring previous memory policy: 4 00:04:41.103 EAL: Calling mem event callback 'spdk:(nil)' 00:04:41.103 EAL: request: mp_malloc_sync 00:04:41.103 EAL: No shared files mode enabled, IPC is disabled 00:04:41.103 EAL: Heap on socket 0 was expanded by 258MB 00:04:41.103 EAL: Calling mem event callback 'spdk:(nil)' 00:04:41.103 EAL: request: mp_malloc_sync 00:04:41.103 EAL: No shared files mode enabled, IPC is disabled 00:04:41.103 EAL: Heap on socket 0 was shrunk by 258MB 00:04:41.103 EAL: Trying to obtain current memory policy. 
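One pattern worth noting in these rounds (an observation from the trace, not documented test design): after the initial 2 MB round, the expansion sizes follow s(n+1) = 2*s(n) - 2, i.e. s(n) = 2^n + 2 MB (4, 6, 10, 18, 34, 66, 130, 258, then 514 and 1026 below), so each allocation lands just past a power-of-two boundary rather than exactly on it.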
00:04:41.103 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:41.362 EAL: Restoring previous memory policy: 4 00:04:41.362 EAL: Calling mem event callback 'spdk:(nil)' 00:04:41.362 EAL: request: mp_malloc_sync 00:04:41.362 EAL: No shared files mode enabled, IPC is disabled 00:04:41.362 EAL: Heap on socket 0 was expanded by 514MB 00:04:41.362 EAL: Calling mem event callback 'spdk:(nil)' 00:04:41.362 EAL: request: mp_malloc_sync 00:04:41.362 EAL: No shared files mode enabled, IPC is disabled 00:04:41.362 EAL: Heap on socket 0 was shrunk by 514MB 00:04:41.362 EAL: Trying to obtain current memory policy. 00:04:41.362 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:41.621 EAL: Restoring previous memory policy: 4 00:04:41.621 EAL: Calling mem event callback 'spdk:(nil)' 00:04:41.621 EAL: request: mp_malloc_sync 00:04:41.621 EAL: No shared files mode enabled, IPC is disabled 00:04:41.621 EAL: Heap on socket 0 was expanded by 1026MB 00:04:41.881 EAL: Calling mem event callback 'spdk:(nil)' 00:04:41.881 EAL: request: mp_malloc_sync 00:04:41.881 EAL: No shared files mode enabled, IPC is disabled 00:04:41.881 EAL: Heap on socket 0 was shrunk by 1026MB 00:04:41.881 passed 00:04:41.881 00:04:41.881 Run Summary: Type Total Ran Passed Failed Inactive 00:04:41.881 suites 1 1 n/a 0 0 00:04:41.881 tests 2 2 2 0 0 00:04:41.881 asserts 497 497 497 0 n/a 00:04:41.881 00:04:41.881 Elapsed time = 0.990 seconds 00:04:41.881 EAL: Calling mem event callback 'spdk:(nil)' 00:04:41.881 EAL: request: mp_malloc_sync 00:04:41.881 EAL: No shared files mode enabled, IPC is disabled 00:04:41.881 EAL: Heap on socket 0 was shrunk by 2MB 00:04:41.881 EAL: No shared files mode enabled, IPC is disabled 00:04:41.881 EAL: No shared files mode enabled, IPC is disabled 00:04:41.881 EAL: No shared files mode enabled, IPC is disabled 00:04:41.881 00:04:41.881 real 0m1.131s 00:04:41.881 user 0m0.636s 00:04:41.881 sys 0m0.466s 00:04:41.881 10:23:03 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:41.881 10:23:03 -- common/autotest_common.sh@10 -- # set +x 00:04:41.881 ************************************ 00:04:41.881 END TEST env_vtophys 00:04:41.881 ************************************ 00:04:42.141 10:23:03 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:04:42.141 10:23:03 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:42.141 10:23:03 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:42.141 10:23:03 -- common/autotest_common.sh@10 -- # set +x 00:04:42.141 ************************************ 00:04:42.141 START TEST env_pci 00:04:42.141 ************************************ 00:04:42.141 10:23:04 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:04:42.141 00:04:42.141 00:04:42.141 CUnit - A unit testing framework for C - Version 2.1-3 00:04:42.141 http://cunit.sourceforge.net/ 00:04:42.141 00:04:42.141 00:04:42.141 Suite: pci 00:04:42.141 Test: pci_hook ...[2024-04-19 10:23:04.139668] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 182276 has claimed it 00:04:42.141 EAL: Cannot find device (10000:00:01.0) 00:04:42.141 EAL: Failed to attach device on primary process 00:04:42.141 passed 00:04:42.141 00:04:42.141 Run Summary: Type Total Ran Passed Failed Inactive 00:04:42.141 suites 1 1 n/a 0 0 00:04:42.141 tests 1 1 1 0 0 
00:04:42.141 asserts 25 25 25 0 n/a 00:04:42.141 00:04:42.141 Elapsed time = 0.036 seconds 00:04:42.141 00:04:42.141 real 0m0.057s 00:04:42.141 user 0m0.013s 00:04:42.141 sys 0m0.043s 00:04:42.141 10:23:04 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:42.141 10:23:04 -- common/autotest_common.sh@10 -- # set +x 00:04:42.141 ************************************ 00:04:42.141 END TEST env_pci 00:04:42.141 ************************************ 00:04:42.141 10:23:04 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:04:42.141 10:23:04 -- env/env.sh@15 -- # uname 00:04:42.141 10:23:04 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:04:42.141 10:23:04 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:04:42.141 10:23:04 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:04:42.141 10:23:04 -- common/autotest_common.sh@1087 -- # '[' 5 -le 1 ']' 00:04:42.141 10:23:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:42.141 10:23:04 -- common/autotest_common.sh@10 -- # set +x 00:04:42.400 ************************************ 00:04:42.400 START TEST env_dpdk_post_init 00:04:42.400 ************************************ 00:04:42.400 10:23:04 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:04:42.400 EAL: Detected CPU lcores: 72 00:04:42.400 EAL: Detected NUMA nodes: 2 00:04:42.400 EAL: Detected static linkage of DPDK 00:04:42.400 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:04:42.400 EAL: Selected IOVA mode 'VA' 00:04:42.400 EAL: No free 2048 kB hugepages reported on node 1 00:04:42.400 EAL: VFIO support initialized 00:04:42.400 TELEMETRY: No legacy callbacks, legacy socket not created 00:04:42.400 EAL: Using IOMMU type 1 (Type 1) 00:04:42.660 EAL: Probe PCI driver: spdk_nvme (144d:a80a) device: 0000:5e:00.0 (socket 0) 00:04:42.919 EAL: Probe PCI driver: spdk_nvme (8086:2701) device: 0000:af:00.0 (socket 1) 00:04:43.178 EAL: Probe PCI driver: spdk_nvme (8086:4140) device: 0000:b0:00.0 (socket 1) 00:04:43.178 EAL: Releasing PCI mapped resource for 0000:af:00.0 00:04:43.178 EAL: Calling pci_unmap_resource for 0000:af:00.0 at 0x202001004000 00:04:43.178 EAL: Releasing PCI mapped resource for 0000:b0:00.0 00:04:43.178 EAL: Calling pci_unmap_resource for 0000:b0:00.0 at 0x202001008000 00:04:43.438 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:04:43.438 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001000000 00:04:43.438 Starting DPDK initialization... 00:04:43.438 Starting SPDK post initialization... 00:04:43.438 SPDK NVMe probe 00:04:43.438 Attaching to 0000:5e:00.0 00:04:43.438 Attaching to 0000:af:00.0 00:04:43.438 Attaching to 0000:b0:00.0 00:04:43.438 Attached to 0000:af:00.0 00:04:43.438 Attached to 0000:b0:00.0 00:04:43.438 Attached to 0000:5e:00.0 00:04:43.438 Cleaning up... 
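The `-c 0x1 --base-virtaddr=0x200000000000` arguments passed to env_dpdk_post_init above are assembled by env.sh@14-22: a one-core mask for the unit test plus, on Linux, a fixed base virtual address so every EAL run maps its hugepage regions at the same place. Condensed (a sketch of what the trace shows, not the script verbatim):

    #!/usr/bin/env bash
    argv='-c 0x1 '                            # pin the unit test to a single core
    if [[ $(uname) == Linux ]]; then
        # a fixed base keeps mappings predictable across runs and secondary processes
        argv+=--base-virtaddr=0x200000000000
    fi
    ./test/env/env_dpdk_post_init/env_dpdk_post_init $argv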
00:04:43.438 00:04:43.438 real 0m1.139s 00:04:43.438 user 0m0.363s 00:04:43.438 sys 0m0.097s 00:04:43.438 10:23:05 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:43.438 10:23:05 -- common/autotest_common.sh@10 -- # set +x 00:04:43.438 ************************************ 00:04:43.438 END TEST env_dpdk_post_init 00:04:43.438 ************************************ 00:04:43.438 10:23:05 -- env/env.sh@26 -- # uname 00:04:43.438 10:23:05 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:04:43.438 10:23:05 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:04:43.438 10:23:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:43.438 10:23:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:43.438 10:23:05 -- common/autotest_common.sh@10 -- # set +x 00:04:43.698 ************************************ 00:04:43.698 START TEST env_mem_callbacks 00:04:43.698 ************************************ 00:04:43.698 10:23:05 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:04:43.698 EAL: Detected CPU lcores: 72 00:04:43.698 EAL: Detected NUMA nodes: 2 00:04:43.698 EAL: Detected static linkage of DPDK 00:04:43.698 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:04:43.698 EAL: Selected IOVA mode 'VA' 00:04:43.698 EAL: No free 2048 kB hugepages reported on node 1 00:04:43.698 EAL: VFIO support initialized 00:04:43.698 TELEMETRY: No legacy callbacks, legacy socket not created 00:04:43.698 00:04:43.698 00:04:43.698 CUnit - A unit testing framework for C - Version 2.1-3 00:04:43.698 http://cunit.sourceforge.net/ 00:04:43.698 00:04:43.698 00:04:43.698 Suite: memory 00:04:43.698 Test: test ... 
00:04:43.698 register 0x200000200000 2097152 00:04:43.698 malloc 3145728 00:04:43.698 register 0x200000400000 4194304 00:04:43.698 buf 0x200000500000 len 3145728 PASSED 00:04:43.698 malloc 64 00:04:43.698 buf 0x2000004fff40 len 64 PASSED 00:04:43.698 malloc 4194304 00:04:43.698 register 0x200000800000 6291456 00:04:43.698 buf 0x200000a00000 len 4194304 PASSED 00:04:43.698 free 0x200000500000 3145728 00:04:43.698 free 0x2000004fff40 64 00:04:43.698 unregister 0x200000400000 4194304 PASSED 00:04:43.698 free 0x200000a00000 4194304 00:04:43.698 unregister 0x200000800000 6291456 PASSED 00:04:43.698 malloc 8388608 00:04:43.698 register 0x200000400000 10485760 00:04:43.698 buf 0x200000600000 len 8388608 PASSED 00:04:43.698 free 0x200000600000 8388608 00:04:43.698 unregister 0x200000400000 10485760 PASSED 00:04:43.698 passed 00:04:43.698 00:04:43.698 Run Summary: Type Total Ran Passed Failed Inactive 00:04:43.698 suites 1 1 n/a 0 0 00:04:43.698 tests 1 1 1 0 0 00:04:43.698 asserts 15 15 15 0 n/a 00:04:43.698 00:04:43.698 Elapsed time = 0.009 seconds 00:04:43.698 00:04:43.698 real 0m0.075s 00:04:43.698 user 0m0.019s 00:04:43.698 sys 0m0.056s 00:04:43.698 10:23:05 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:43.698 10:23:05 -- common/autotest_common.sh@10 -- # set +x 00:04:43.698 ************************************ 00:04:43.698 END TEST env_mem_callbacks 00:04:43.698 ************************************ 00:04:43.698 00:04:43.698 real 0m3.445s 00:04:43.698 user 0m1.446s 00:04:43.698 sys 0m1.244s 00:04:43.698 10:23:05 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:43.698 10:23:05 -- common/autotest_common.sh@10 -- # set +x 00:04:43.699 ************************************ 00:04:43.699 END TEST env 00:04:43.699 ************************************ 00:04:43.958 10:23:05 -- spdk/autotest.sh@165 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:04:43.958 10:23:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:43.958 10:23:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:43.958 10:23:05 -- common/autotest_common.sh@10 -- # set +x 00:04:43.958 ************************************ 00:04:43.958 START TEST rpc 00:04:43.958 ************************************ 00:04:43.958 10:23:05 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:04:43.958 * Looking for test storage... 00:04:43.958 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:04:44.218 10:23:06 -- rpc/rpc.sh@65 -- # spdk_pid=182578 00:04:44.218 10:23:06 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:04:44.218 10:23:06 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:04:44.218 10:23:06 -- rpc/rpc.sh@67 -- # waitforlisten 182578 00:04:44.218 10:23:06 -- common/autotest_common.sh@817 -- # '[' -z 182578 ']' 00:04:44.218 10:23:06 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:44.218 10:23:06 -- common/autotest_common.sh@822 -- # local max_retries=100 00:04:44.218 10:23:06 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:44.218 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
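rpc.sh has just launched `spdk_tgt -e bdev` in the background, and `waitforlisten` now polls until the target answers on its UNIX-domain RPC socket. A simplified sketch of that wait loop (the real helper in autotest_common.sh handles more failure cases; `rpc_get_methods` is the stock SPDK RPC used here as a liveness probe):

    #!/usr/bin/env bash
    rpc_addr=/var/tmp/spdk.sock
    max_retries=100
    for ((i = 0; i < max_retries; i++)); do
        # the target counts as up once any RPC round-trips on the socket
        if ./scripts/rpc.py -s "$rpc_addr" rpc_get_methods &> /dev/null; then
            break
        fi
        sleep 0.5
    done
    (( i < max_retries ))    # give up (and fail) if the target never came up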
00:04:44.218 10:23:06 -- common/autotest_common.sh@826 -- # xtrace_disable 00:04:44.218 10:23:06 -- common/autotest_common.sh@10 -- # set +x 00:04:44.218 [2024-04-19 10:23:06.094186] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:04:44.218 [2024-04-19 10:23:06.094264] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid182578 ] 00:04:44.218 EAL: No free 2048 kB hugepages reported on node 1 00:04:44.218 [2024-04-19 10:23:06.179969] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:44.218 [2024-04-19 10:23:06.264547] app.c: 523:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:04:44.218 [2024-04-19 10:23:06.264585] app.c: 527:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 182578' to capture a snapshot of events at runtime. 00:04:44.218 [2024-04-19 10:23:06.264594] app.c: 529:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:04:44.218 [2024-04-19 10:23:06.264603] app.c: 530:app_setup_trace: *NOTICE*: SPDK application currently running. 00:04:44.218 [2024-04-19 10:23:06.264610] app.c: 531:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid182578 for offline analysis/debug. 00:04:44.218 [2024-04-19 10:23:06.264637] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:45.157 10:23:06 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:04:45.157 10:23:06 -- common/autotest_common.sh@850 -- # return 0 00:04:45.157 10:23:06 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:04:45.157 10:23:06 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:04:45.157 10:23:06 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:04:45.157 10:23:06 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:04:45.157 10:23:06 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:45.157 10:23:06 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:45.158 10:23:06 -- common/autotest_common.sh@10 -- # set +x 00:04:45.158 ************************************ 00:04:45.158 START TEST rpc_integrity 00:04:45.158 ************************************ 00:04:45.158 10:23:07 -- common/autotest_common.sh@1111 -- # rpc_integrity 00:04:45.158 10:23:07 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:04:45.158 10:23:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:45.158 10:23:07 -- common/autotest_common.sh@10 -- # set +x 00:04:45.158 10:23:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:45.158 10:23:07 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:04:45.158 10:23:07 -- rpc/rpc.sh@13 -- # jq length 00:04:45.158 10:23:07 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:04:45.158 10:23:07 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:04:45.158 10:23:07 -- common/autotest_common.sh@549 -- # 
xtrace_disable 00:04:45.158 10:23:07 -- common/autotest_common.sh@10 -- # set +x 00:04:45.158 10:23:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:45.158 10:23:07 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:04:45.158 10:23:07 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:04:45.158 10:23:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:45.158 10:23:07 -- common/autotest_common.sh@10 -- # set +x 00:04:45.158 10:23:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:45.158 10:23:07 -- rpc/rpc.sh@16 -- # bdevs='[ 00:04:45.158 { 00:04:45.158 "name": "Malloc0", 00:04:45.158 "aliases": [ 00:04:45.158 "52ebdaff-5efb-4725-829a-bd37d6bb24e9" 00:04:45.158 ], 00:04:45.158 "product_name": "Malloc disk", 00:04:45.158 "block_size": 512, 00:04:45.158 "num_blocks": 16384, 00:04:45.158 "uuid": "52ebdaff-5efb-4725-829a-bd37d6bb24e9", 00:04:45.158 "assigned_rate_limits": { 00:04:45.158 "rw_ios_per_sec": 0, 00:04:45.158 "rw_mbytes_per_sec": 0, 00:04:45.158 "r_mbytes_per_sec": 0, 00:04:45.158 "w_mbytes_per_sec": 0 00:04:45.158 }, 00:04:45.158 "claimed": false, 00:04:45.158 "zoned": false, 00:04:45.158 "supported_io_types": { 00:04:45.158 "read": true, 00:04:45.158 "write": true, 00:04:45.158 "unmap": true, 00:04:45.158 "write_zeroes": true, 00:04:45.158 "flush": true, 00:04:45.158 "reset": true, 00:04:45.158 "compare": false, 00:04:45.158 "compare_and_write": false, 00:04:45.158 "abort": true, 00:04:45.158 "nvme_admin": false, 00:04:45.158 "nvme_io": false 00:04:45.158 }, 00:04:45.158 "memory_domains": [ 00:04:45.158 { 00:04:45.158 "dma_device_id": "system", 00:04:45.158 "dma_device_type": 1 00:04:45.158 }, 00:04:45.158 { 00:04:45.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:45.158 "dma_device_type": 2 00:04:45.158 } 00:04:45.158 ], 00:04:45.158 "driver_specific": {} 00:04:45.158 } 00:04:45.158 ]' 00:04:45.158 10:23:07 -- rpc/rpc.sh@17 -- # jq length 00:04:45.158 10:23:07 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:04:45.158 10:23:07 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:04:45.158 10:23:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:45.158 10:23:07 -- common/autotest_common.sh@10 -- # set +x 00:04:45.158 [2024-04-19 10:23:07.167976] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:04:45.158 [2024-04-19 10:23:07.168010] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:04:45.158 [2024-04-19 10:23:07.168026] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x585e840 00:04:45.158 [2024-04-19 10:23:07.168035] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:04:45.158 [2024-04-19 10:23:07.168829] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:04:45.158 [2024-04-19 10:23:07.168853] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:04:45.158 Passthru0 00:04:45.158 10:23:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:45.158 10:23:07 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:04:45.158 10:23:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:45.158 10:23:07 -- common/autotest_common.sh@10 -- # set +x 00:04:45.158 10:23:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:45.158 10:23:07 -- rpc/rpc.sh@20 -- # bdevs='[ 00:04:45.158 { 00:04:45.158 "name": "Malloc0", 00:04:45.158 "aliases": [ 00:04:45.158 "52ebdaff-5efb-4725-829a-bd37d6bb24e9" 00:04:45.158 ], 00:04:45.158 "product_name": "Malloc disk", 00:04:45.158 
"block_size": 512, 00:04:45.158 "num_blocks": 16384, 00:04:45.158 "uuid": "52ebdaff-5efb-4725-829a-bd37d6bb24e9", 00:04:45.158 "assigned_rate_limits": { 00:04:45.158 "rw_ios_per_sec": 0, 00:04:45.158 "rw_mbytes_per_sec": 0, 00:04:45.158 "r_mbytes_per_sec": 0, 00:04:45.158 "w_mbytes_per_sec": 0 00:04:45.158 }, 00:04:45.158 "claimed": true, 00:04:45.158 "claim_type": "exclusive_write", 00:04:45.158 "zoned": false, 00:04:45.158 "supported_io_types": { 00:04:45.158 "read": true, 00:04:45.158 "write": true, 00:04:45.158 "unmap": true, 00:04:45.158 "write_zeroes": true, 00:04:45.158 "flush": true, 00:04:45.158 "reset": true, 00:04:45.158 "compare": false, 00:04:45.158 "compare_and_write": false, 00:04:45.158 "abort": true, 00:04:45.158 "nvme_admin": false, 00:04:45.158 "nvme_io": false 00:04:45.158 }, 00:04:45.158 "memory_domains": [ 00:04:45.158 { 00:04:45.158 "dma_device_id": "system", 00:04:45.158 "dma_device_type": 1 00:04:45.158 }, 00:04:45.158 { 00:04:45.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:45.158 "dma_device_type": 2 00:04:45.158 } 00:04:45.158 ], 00:04:45.158 "driver_specific": {} 00:04:45.158 }, 00:04:45.158 { 00:04:45.158 "name": "Passthru0", 00:04:45.158 "aliases": [ 00:04:45.158 "d5f06ff3-5e99-58a3-a0a7-1a92504ec20a" 00:04:45.158 ], 00:04:45.158 "product_name": "passthru", 00:04:45.158 "block_size": 512, 00:04:45.158 "num_blocks": 16384, 00:04:45.158 "uuid": "d5f06ff3-5e99-58a3-a0a7-1a92504ec20a", 00:04:45.158 "assigned_rate_limits": { 00:04:45.158 "rw_ios_per_sec": 0, 00:04:45.158 "rw_mbytes_per_sec": 0, 00:04:45.158 "r_mbytes_per_sec": 0, 00:04:45.158 "w_mbytes_per_sec": 0 00:04:45.158 }, 00:04:45.158 "claimed": false, 00:04:45.158 "zoned": false, 00:04:45.158 "supported_io_types": { 00:04:45.158 "read": true, 00:04:45.158 "write": true, 00:04:45.158 "unmap": true, 00:04:45.158 "write_zeroes": true, 00:04:45.158 "flush": true, 00:04:45.158 "reset": true, 00:04:45.158 "compare": false, 00:04:45.158 "compare_and_write": false, 00:04:45.158 "abort": true, 00:04:45.158 "nvme_admin": false, 00:04:45.158 "nvme_io": false 00:04:45.158 }, 00:04:45.158 "memory_domains": [ 00:04:45.158 { 00:04:45.158 "dma_device_id": "system", 00:04:45.158 "dma_device_type": 1 00:04:45.158 }, 00:04:45.158 { 00:04:45.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:45.158 "dma_device_type": 2 00:04:45.158 } 00:04:45.158 ], 00:04:45.158 "driver_specific": { 00:04:45.158 "passthru": { 00:04:45.158 "name": "Passthru0", 00:04:45.158 "base_bdev_name": "Malloc0" 00:04:45.158 } 00:04:45.158 } 00:04:45.158 } 00:04:45.158 ]' 00:04:45.158 10:23:07 -- rpc/rpc.sh@21 -- # jq length 00:04:45.158 10:23:07 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:04:45.158 10:23:07 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:04:45.158 10:23:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:45.158 10:23:07 -- common/autotest_common.sh@10 -- # set +x 00:04:45.158 10:23:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:45.158 10:23:07 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:04:45.158 10:23:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:45.158 10:23:07 -- common/autotest_common.sh@10 -- # set +x 00:04:45.158 10:23:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:45.158 10:23:07 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:04:45.158 10:23:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:45.158 10:23:07 -- common/autotest_common.sh@10 -- # set +x 00:04:45.418 10:23:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 
00:04:45.418 10:23:07 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:04:45.418 10:23:07 -- rpc/rpc.sh@26 -- # jq length 00:04:45.418 10:23:07 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:04:45.418 00:04:45.418 real 0m0.284s 00:04:45.418 user 0m0.176s 00:04:45.418 sys 0m0.050s 00:04:45.418 10:23:07 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:45.418 10:23:07 -- common/autotest_common.sh@10 -- # set +x 00:04:45.418 ************************************ 00:04:45.418 END TEST rpc_integrity 00:04:45.418 ************************************ 00:04:45.418 10:23:07 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:04:45.418 10:23:07 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:45.418 10:23:07 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:45.418 10:23:07 -- common/autotest_common.sh@10 -- # set +x 00:04:45.418 ************************************ 00:04:45.418 START TEST rpc_plugins 00:04:45.418 ************************************ 00:04:45.418 10:23:07 -- common/autotest_common.sh@1111 -- # rpc_plugins 00:04:45.418 10:23:07 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:04:45.418 10:23:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:45.418 10:23:07 -- common/autotest_common.sh@10 -- # set +x 00:04:45.418 10:23:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:45.418 10:23:07 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:04:45.418 10:23:07 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:04:45.418 10:23:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:45.418 10:23:07 -- common/autotest_common.sh@10 -- # set +x 00:04:45.418 10:23:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:45.418 10:23:07 -- rpc/rpc.sh@31 -- # bdevs='[ 00:04:45.418 { 00:04:45.418 "name": "Malloc1", 00:04:45.418 "aliases": [ 00:04:45.418 "b5b0fed2-74dc-460c-9b4b-e1191dcdff72" 00:04:45.418 ], 00:04:45.418 "product_name": "Malloc disk", 00:04:45.418 "block_size": 4096, 00:04:45.418 "num_blocks": 256, 00:04:45.418 "uuid": "b5b0fed2-74dc-460c-9b4b-e1191dcdff72", 00:04:45.418 "assigned_rate_limits": { 00:04:45.418 "rw_ios_per_sec": 0, 00:04:45.418 "rw_mbytes_per_sec": 0, 00:04:45.418 "r_mbytes_per_sec": 0, 00:04:45.418 "w_mbytes_per_sec": 0 00:04:45.418 }, 00:04:45.418 "claimed": false, 00:04:45.418 "zoned": false, 00:04:45.418 "supported_io_types": { 00:04:45.418 "read": true, 00:04:45.418 "write": true, 00:04:45.418 "unmap": true, 00:04:45.418 "write_zeroes": true, 00:04:45.418 "flush": true, 00:04:45.418 "reset": true, 00:04:45.418 "compare": false, 00:04:45.418 "compare_and_write": false, 00:04:45.418 "abort": true, 00:04:45.418 "nvme_admin": false, 00:04:45.418 "nvme_io": false 00:04:45.418 }, 00:04:45.418 "memory_domains": [ 00:04:45.418 { 00:04:45.418 "dma_device_id": "system", 00:04:45.418 "dma_device_type": 1 00:04:45.418 }, 00:04:45.418 { 00:04:45.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:45.418 "dma_device_type": 2 00:04:45.418 } 00:04:45.418 ], 00:04:45.418 "driver_specific": {} 00:04:45.418 } 00:04:45.418 ]' 00:04:45.418 10:23:07 -- rpc/rpc.sh@32 -- # jq length 00:04:45.677 10:23:07 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:04:45.677 10:23:07 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:04:45.677 10:23:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:45.677 10:23:07 -- common/autotest_common.sh@10 -- # set +x 00:04:45.677 10:23:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:45.677 10:23:07 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:04:45.677 10:23:07 -- 
common/autotest_common.sh@549 -- # xtrace_disable 00:04:45.677 10:23:07 -- common/autotest_common.sh@10 -- # set +x 00:04:45.677 10:23:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:45.677 10:23:07 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:04:45.677 10:23:07 -- rpc/rpc.sh@36 -- # jq length 00:04:45.677 10:23:07 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:04:45.677 00:04:45.677 real 0m0.146s 00:04:45.677 user 0m0.086s 00:04:45.677 sys 0m0.025s 00:04:45.677 10:23:07 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:45.677 10:23:07 -- common/autotest_common.sh@10 -- # set +x 00:04:45.677 ************************************ 00:04:45.677 END TEST rpc_plugins 00:04:45.677 ************************************ 00:04:45.677 10:23:07 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:04:45.677 10:23:07 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:45.677 10:23:07 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:45.677 10:23:07 -- common/autotest_common.sh@10 -- # set +x 00:04:45.677 ************************************ 00:04:45.677 START TEST rpc_trace_cmd_test 00:04:45.677 ************************************ 00:04:45.677 10:23:07 -- common/autotest_common.sh@1111 -- # rpc_trace_cmd_test 00:04:45.677 10:23:07 -- rpc/rpc.sh@40 -- # local info 00:04:45.677 10:23:07 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:04:45.677 10:23:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:45.677 10:23:07 -- common/autotest_common.sh@10 -- # set +x 00:04:45.937 10:23:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:45.937 10:23:07 -- rpc/rpc.sh@42 -- # info='{ 00:04:45.937 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid182578", 00:04:45.937 "tpoint_group_mask": "0x8", 00:04:45.937 "iscsi_conn": { 00:04:45.937 "mask": "0x2", 00:04:45.937 "tpoint_mask": "0x0" 00:04:45.937 }, 00:04:45.937 "scsi": { 00:04:45.937 "mask": "0x4", 00:04:45.937 "tpoint_mask": "0x0" 00:04:45.937 }, 00:04:45.937 "bdev": { 00:04:45.937 "mask": "0x8", 00:04:45.937 "tpoint_mask": "0xffffffffffffffff" 00:04:45.937 }, 00:04:45.937 "nvmf_rdma": { 00:04:45.937 "mask": "0x10", 00:04:45.937 "tpoint_mask": "0x0" 00:04:45.937 }, 00:04:45.937 "nvmf_tcp": { 00:04:45.937 "mask": "0x20", 00:04:45.937 "tpoint_mask": "0x0" 00:04:45.937 }, 00:04:45.937 "ftl": { 00:04:45.937 "mask": "0x40", 00:04:45.937 "tpoint_mask": "0x0" 00:04:45.937 }, 00:04:45.937 "blobfs": { 00:04:45.937 "mask": "0x80", 00:04:45.937 "tpoint_mask": "0x0" 00:04:45.937 }, 00:04:45.937 "dsa": { 00:04:45.937 "mask": "0x200", 00:04:45.937 "tpoint_mask": "0x0" 00:04:45.937 }, 00:04:45.937 "thread": { 00:04:45.937 "mask": "0x400", 00:04:45.937 "tpoint_mask": "0x0" 00:04:45.937 }, 00:04:45.937 "nvme_pcie": { 00:04:45.937 "mask": "0x800", 00:04:45.937 "tpoint_mask": "0x0" 00:04:45.937 }, 00:04:45.937 "iaa": { 00:04:45.937 "mask": "0x1000", 00:04:45.937 "tpoint_mask": "0x0" 00:04:45.937 }, 00:04:45.937 "nvme_tcp": { 00:04:45.937 "mask": "0x2000", 00:04:45.937 "tpoint_mask": "0x0" 00:04:45.937 }, 00:04:45.937 "bdev_nvme": { 00:04:45.937 "mask": "0x4000", 00:04:45.937 "tpoint_mask": "0x0" 00:04:45.937 }, 00:04:45.937 "sock": { 00:04:45.937 "mask": "0x8000", 00:04:45.937 "tpoint_mask": "0x0" 00:04:45.937 } 00:04:45.937 }' 00:04:45.937 10:23:07 -- rpc/rpc.sh@43 -- # jq length 00:04:45.937 10:23:07 -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:04:45.937 10:23:07 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:04:45.937 10:23:07 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:04:45.937 10:23:07 -- rpc/rpc.sh@45 -- # jq 
'has("tpoint_shm_path")' 00:04:45.937 10:23:07 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:04:45.937 10:23:07 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:04:45.937 10:23:07 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:04:45.937 10:23:07 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:04:45.937 10:23:08 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:04:45.937 00:04:45.937 real 0m0.236s 00:04:45.937 user 0m0.186s 00:04:45.937 sys 0m0.040s 00:04:45.937 10:23:08 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:45.937 10:23:08 -- common/autotest_common.sh@10 -- # set +x 00:04:45.937 ************************************ 00:04:45.937 END TEST rpc_trace_cmd_test 00:04:45.937 ************************************ 00:04:46.196 10:23:08 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:04:46.196 10:23:08 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:04:46.196 10:23:08 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:04:46.196 10:23:08 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:46.196 10:23:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:46.196 10:23:08 -- common/autotest_common.sh@10 -- # set +x 00:04:46.196 ************************************ 00:04:46.196 START TEST rpc_daemon_integrity 00:04:46.196 ************************************ 00:04:46.196 10:23:08 -- common/autotest_common.sh@1111 -- # rpc_integrity 00:04:46.196 10:23:08 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:04:46.196 10:23:08 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:46.196 10:23:08 -- common/autotest_common.sh@10 -- # set +x 00:04:46.196 10:23:08 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:46.196 10:23:08 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:04:46.196 10:23:08 -- rpc/rpc.sh@13 -- # jq length 00:04:46.196 10:23:08 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:04:46.196 10:23:08 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:04:46.196 10:23:08 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:46.196 10:23:08 -- common/autotest_common.sh@10 -- # set +x 00:04:46.196 10:23:08 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:46.196 10:23:08 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:04:46.196 10:23:08 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:04:46.196 10:23:08 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:46.196 10:23:08 -- common/autotest_common.sh@10 -- # set +x 00:04:46.196 10:23:08 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:46.196 10:23:08 -- rpc/rpc.sh@16 -- # bdevs='[ 00:04:46.196 { 00:04:46.196 "name": "Malloc2", 00:04:46.196 "aliases": [ 00:04:46.196 "584d75a5-da6f-4eca-98e8-22d02cb3932d" 00:04:46.196 ], 00:04:46.196 "product_name": "Malloc disk", 00:04:46.196 "block_size": 512, 00:04:46.196 "num_blocks": 16384, 00:04:46.196 "uuid": "584d75a5-da6f-4eca-98e8-22d02cb3932d", 00:04:46.196 "assigned_rate_limits": { 00:04:46.196 "rw_ios_per_sec": 0, 00:04:46.196 "rw_mbytes_per_sec": 0, 00:04:46.196 "r_mbytes_per_sec": 0, 00:04:46.196 "w_mbytes_per_sec": 0 00:04:46.196 }, 00:04:46.196 "claimed": false, 00:04:46.196 "zoned": false, 00:04:46.196 "supported_io_types": { 00:04:46.196 "read": true, 00:04:46.196 "write": true, 00:04:46.196 "unmap": true, 00:04:46.196 "write_zeroes": true, 00:04:46.196 "flush": true, 00:04:46.196 "reset": true, 00:04:46.196 "compare": false, 00:04:46.196 "compare_and_write": false, 00:04:46.196 "abort": true, 00:04:46.196 "nvme_admin": false, 00:04:46.196 "nvme_io": false 00:04:46.197 }, 00:04:46.197 "memory_domains": [ 00:04:46.197 { 00:04:46.197 "dma_device_id": 
"system", 00:04:46.197 "dma_device_type": 1 00:04:46.197 }, 00:04:46.197 { 00:04:46.197 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:46.197 "dma_device_type": 2 00:04:46.197 } 00:04:46.197 ], 00:04:46.197 "driver_specific": {} 00:04:46.197 } 00:04:46.197 ]' 00:04:46.197 10:23:08 -- rpc/rpc.sh@17 -- # jq length 00:04:46.456 10:23:08 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:04:46.456 10:23:08 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:04:46.456 10:23:08 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:46.456 10:23:08 -- common/autotest_common.sh@10 -- # set +x 00:04:46.456 [2024-04-19 10:23:08.318904] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:04:46.456 [2024-04-19 10:23:08.318936] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:04:46.456 [2024-04-19 10:23:08.318952] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x581a1c0 00:04:46.456 [2024-04-19 10:23:08.318962] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:04:46.456 [2024-04-19 10:23:08.319702] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:04:46.456 [2024-04-19 10:23:08.319725] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:04:46.456 Passthru0 00:04:46.456 10:23:08 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:46.456 10:23:08 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:04:46.456 10:23:08 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:46.456 10:23:08 -- common/autotest_common.sh@10 -- # set +x 00:04:46.456 10:23:08 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:46.456 10:23:08 -- rpc/rpc.sh@20 -- # bdevs='[ 00:04:46.456 { 00:04:46.456 "name": "Malloc2", 00:04:46.456 "aliases": [ 00:04:46.456 "584d75a5-da6f-4eca-98e8-22d02cb3932d" 00:04:46.456 ], 00:04:46.456 "product_name": "Malloc disk", 00:04:46.456 "block_size": 512, 00:04:46.456 "num_blocks": 16384, 00:04:46.456 "uuid": "584d75a5-da6f-4eca-98e8-22d02cb3932d", 00:04:46.456 "assigned_rate_limits": { 00:04:46.457 "rw_ios_per_sec": 0, 00:04:46.457 "rw_mbytes_per_sec": 0, 00:04:46.457 "r_mbytes_per_sec": 0, 00:04:46.457 "w_mbytes_per_sec": 0 00:04:46.457 }, 00:04:46.457 "claimed": true, 00:04:46.457 "claim_type": "exclusive_write", 00:04:46.457 "zoned": false, 00:04:46.457 "supported_io_types": { 00:04:46.457 "read": true, 00:04:46.457 "write": true, 00:04:46.457 "unmap": true, 00:04:46.457 "write_zeroes": true, 00:04:46.457 "flush": true, 00:04:46.457 "reset": true, 00:04:46.457 "compare": false, 00:04:46.457 "compare_and_write": false, 00:04:46.457 "abort": true, 00:04:46.457 "nvme_admin": false, 00:04:46.457 "nvme_io": false 00:04:46.457 }, 00:04:46.457 "memory_domains": [ 00:04:46.457 { 00:04:46.457 "dma_device_id": "system", 00:04:46.457 "dma_device_type": 1 00:04:46.457 }, 00:04:46.457 { 00:04:46.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:46.457 "dma_device_type": 2 00:04:46.457 } 00:04:46.457 ], 00:04:46.457 "driver_specific": {} 00:04:46.457 }, 00:04:46.457 { 00:04:46.457 "name": "Passthru0", 00:04:46.457 "aliases": [ 00:04:46.457 "58ed991f-7051-5e90-b814-6d4ed3f81075" 00:04:46.457 ], 00:04:46.457 "product_name": "passthru", 00:04:46.457 "block_size": 512, 00:04:46.457 "num_blocks": 16384, 00:04:46.457 "uuid": "58ed991f-7051-5e90-b814-6d4ed3f81075", 00:04:46.457 "assigned_rate_limits": { 00:04:46.457 "rw_ios_per_sec": 0, 00:04:46.457 "rw_mbytes_per_sec": 0, 00:04:46.457 "r_mbytes_per_sec": 0, 
00:04:46.457 "w_mbytes_per_sec": 0 00:04:46.457 }, 00:04:46.457 "claimed": false, 00:04:46.457 "zoned": false, 00:04:46.457 "supported_io_types": { 00:04:46.457 "read": true, 00:04:46.457 "write": true, 00:04:46.457 "unmap": true, 00:04:46.457 "write_zeroes": true, 00:04:46.457 "flush": true, 00:04:46.457 "reset": true, 00:04:46.457 "compare": false, 00:04:46.457 "compare_and_write": false, 00:04:46.457 "abort": true, 00:04:46.457 "nvme_admin": false, 00:04:46.457 "nvme_io": false 00:04:46.457 }, 00:04:46.457 "memory_domains": [ 00:04:46.457 { 00:04:46.457 "dma_device_id": "system", 00:04:46.457 "dma_device_type": 1 00:04:46.457 }, 00:04:46.457 { 00:04:46.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:46.457 "dma_device_type": 2 00:04:46.457 } 00:04:46.457 ], 00:04:46.457 "driver_specific": { 00:04:46.457 "passthru": { 00:04:46.457 "name": "Passthru0", 00:04:46.457 "base_bdev_name": "Malloc2" 00:04:46.457 } 00:04:46.457 } 00:04:46.457 } 00:04:46.457 ]' 00:04:46.457 10:23:08 -- rpc/rpc.sh@21 -- # jq length 00:04:46.457 10:23:08 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:04:46.457 10:23:08 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:04:46.457 10:23:08 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:46.457 10:23:08 -- common/autotest_common.sh@10 -- # set +x 00:04:46.457 10:23:08 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:46.457 10:23:08 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:04:46.457 10:23:08 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:46.457 10:23:08 -- common/autotest_common.sh@10 -- # set +x 00:04:46.457 10:23:08 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:46.457 10:23:08 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:04:46.457 10:23:08 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:46.457 10:23:08 -- common/autotest_common.sh@10 -- # set +x 00:04:46.457 10:23:08 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:46.457 10:23:08 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:04:46.457 10:23:08 -- rpc/rpc.sh@26 -- # jq length 00:04:46.457 10:23:08 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:04:46.457 00:04:46.457 real 0m0.275s 00:04:46.457 user 0m0.165s 00:04:46.457 sys 0m0.045s 00:04:46.457 10:23:08 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:46.457 10:23:08 -- common/autotest_common.sh@10 -- # set +x 00:04:46.457 ************************************ 00:04:46.457 END TEST rpc_daemon_integrity 00:04:46.457 ************************************ 00:04:46.457 10:23:08 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:04:46.457 10:23:08 -- rpc/rpc.sh@84 -- # killprocess 182578 00:04:46.457 10:23:08 -- common/autotest_common.sh@936 -- # '[' -z 182578 ']' 00:04:46.457 10:23:08 -- common/autotest_common.sh@940 -- # kill -0 182578 00:04:46.457 10:23:08 -- common/autotest_common.sh@941 -- # uname 00:04:46.457 10:23:08 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:04:46.457 10:23:08 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 182578 00:04:46.457 10:23:08 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:04:46.457 10:23:08 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:04:46.457 10:23:08 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 182578' 00:04:46.457 killing process with pid 182578 00:04:46.457 10:23:08 -- common/autotest_common.sh@955 -- # kill 182578 00:04:46.457 10:23:08 -- common/autotest_common.sh@960 -- # wait 182578 00:04:47.027 00:04:47.027 real 0m2.891s 00:04:47.027 user 0m3.660s 
00:04:47.027 sys 0m0.974s 00:04:47.027 10:23:08 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:47.027 10:23:08 -- common/autotest_common.sh@10 -- # set +x 00:04:47.027 ************************************ 00:04:47.027 END TEST rpc 00:04:47.027 ************************************ 00:04:47.027 10:23:08 -- spdk/autotest.sh@166 -- # run_test skip_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:04:47.027 10:23:08 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:47.027 10:23:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:47.027 10:23:08 -- common/autotest_common.sh@10 -- # set +x 00:04:47.027 ************************************ 00:04:47.027 START TEST skip_rpc 00:04:47.027 ************************************ 00:04:47.027 10:23:09 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:04:47.027 * Looking for test storage... 00:04:47.027 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:04:47.028 10:23:09 -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:04:47.028 10:23:09 -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:04:47.028 10:23:09 -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:04:47.028 10:23:09 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:47.028 10:23:09 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:47.028 10:23:09 -- common/autotest_common.sh@10 -- # set +x 00:04:47.287 ************************************ 00:04:47.287 START TEST skip_rpc 00:04:47.287 ************************************ 00:04:47.287 10:23:09 -- common/autotest_common.sh@1111 -- # test_skip_rpc 00:04:47.287 10:23:09 -- rpc/skip_rpc.sh@16 -- # local spdk_pid=183168 00:04:47.287 10:23:09 -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:04:47.287 10:23:09 -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:04:47.287 10:23:09 -- rpc/skip_rpc.sh@19 -- # sleep 5 00:04:47.287 [2024-04-19 10:23:09.272883] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
00:04:47.287 [2024-04-19 10:23:09.272963] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid183168 ] 00:04:47.287 EAL: No free 2048 kB hugepages reported on node 1 00:04:47.287 [2024-04-19 10:23:09.357357] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:47.548 [2024-04-19 10:23:09.434740] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:52.822 10:23:14 -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:04:52.822 10:23:14 -- common/autotest_common.sh@638 -- # local es=0 00:04:52.822 10:23:14 -- common/autotest_common.sh@640 -- # valid_exec_arg rpc_cmd spdk_get_version 00:04:52.822 10:23:14 -- common/autotest_common.sh@626 -- # local arg=rpc_cmd 00:04:52.822 10:23:14 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:04:52.822 10:23:14 -- common/autotest_common.sh@630 -- # type -t rpc_cmd 00:04:52.822 10:23:14 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:04:52.822 10:23:14 -- common/autotest_common.sh@641 -- # rpc_cmd spdk_get_version 00:04:52.822 10:23:14 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:52.822 10:23:14 -- common/autotest_common.sh@10 -- # set +x 00:04:52.822 10:23:14 -- common/autotest_common.sh@577 -- # [[ 1 == 0 ]] 00:04:52.822 10:23:14 -- common/autotest_common.sh@641 -- # es=1 00:04:52.822 10:23:14 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:04:52.822 10:23:14 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:04:52.822 10:23:14 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:04:52.822 10:23:14 -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:04:52.822 10:23:14 -- rpc/skip_rpc.sh@23 -- # killprocess 183168 00:04:52.822 10:23:14 -- common/autotest_common.sh@936 -- # '[' -z 183168 ']' 00:04:52.822 10:23:14 -- common/autotest_common.sh@940 -- # kill -0 183168 00:04:52.822 10:23:14 -- common/autotest_common.sh@941 -- # uname 00:04:52.822 10:23:14 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:04:52.822 10:23:14 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 183168 00:04:52.822 10:23:14 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:04:52.822 10:23:14 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:04:52.822 10:23:14 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 183168' 00:04:52.822 killing process with pid 183168 00:04:52.822 10:23:14 -- common/autotest_common.sh@955 -- # kill 183168 00:04:52.822 10:23:14 -- common/autotest_common.sh@960 -- # wait 183168 00:04:52.822 00:04:52.822 real 0m5.367s 00:04:52.822 user 0m5.116s 00:04:52.822 sys 0m0.289s 00:04:52.822 10:23:14 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:52.822 10:23:14 -- common/autotest_common.sh@10 -- # set +x 00:04:52.822 ************************************ 00:04:52.822 END TEST skip_rpc 00:04:52.822 ************************************ 00:04:52.822 10:23:14 -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:04:52.822 10:23:14 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:52.822 10:23:14 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:52.822 10:23:14 -- common/autotest_common.sh@10 -- # set +x 00:04:52.822 ************************************ 00:04:52.822 START TEST skip_rpc_with_json 00:04:52.822 ************************************ 00:04:52.822 
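Before the skip_rpc_with_json body begins below, the skip_rpc result above condenses to one assertion: with --no-rpc-server nothing listens on the socket, so any RPC must fail. A hand-run equivalent, assuming the same binaries the log invokes (the leading ! mirrors the harness's NOT wrapper):

    build/bin/spdk_tgt --no-rpc-server -m 0x1 &
    tgt=$!
    sleep 5                              # the harness also sleeps 5 s rather than polling
    ! scripts/rpc.py spdk_get_version    # must fail: no RPC server was started
    kill -9 $tgt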
10:23:14 -- common/autotest_common.sh@1111 -- # test_skip_rpc_with_json 00:04:52.822 10:23:14 -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:04:52.822 10:23:14 -- rpc/skip_rpc.sh@28 -- # local spdk_pid=183968 00:04:52.822 10:23:14 -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:04:52.822 10:23:14 -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:04:52.822 10:23:14 -- rpc/skip_rpc.sh@31 -- # waitforlisten 183968 00:04:52.822 10:23:14 -- common/autotest_common.sh@817 -- # '[' -z 183968 ']' 00:04:52.822 10:23:14 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:52.822 10:23:14 -- common/autotest_common.sh@822 -- # local max_retries=100 00:04:52.822 10:23:14 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:52.822 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:52.822 10:23:14 -- common/autotest_common.sh@826 -- # xtrace_disable 00:04:52.822 10:23:14 -- common/autotest_common.sh@10 -- # set +x 00:04:52.822 [2024-04-19 10:23:14.812398] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:04:52.822 [2024-04-19 10:23:14.812488] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid183968 ] 00:04:52.822 EAL: No free 2048 kB hugepages reported on node 1 00:04:52.822 [2024-04-19 10:23:14.895368] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:53.082 [2024-04-19 10:23:14.982980] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:53.649 10:23:15 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:04:53.649 10:23:15 -- common/autotest_common.sh@850 -- # return 0 00:04:53.649 10:23:15 -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:04:53.649 10:23:15 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:53.649 10:23:15 -- common/autotest_common.sh@10 -- # set +x 00:04:53.649 [2024-04-19 10:23:15.643774] nvmf_rpc.c:2504:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:04:53.649 request: 00:04:53.649 { 00:04:53.649 "trtype": "tcp", 00:04:53.649 "method": "nvmf_get_transports", 00:04:53.649 "req_id": 1 00:04:53.649 } 00:04:53.649 Got JSON-RPC error response 00:04:53.649 response: 00:04:53.649 { 00:04:53.649 "code": -19, 00:04:53.649 "message": "No such device" 00:04:53.649 } 00:04:53.649 10:23:15 -- common/autotest_common.sh@577 -- # [[ 1 == 0 ]] 00:04:53.649 10:23:15 -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:04:53.649 10:23:15 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:53.649 10:23:15 -- common/autotest_common.sh@10 -- # set +x 00:04:53.649 [2024-04-19 10:23:15.655863] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:04:53.649 10:23:15 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:53.649 10:23:15 -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:04:53.649 10:23:15 -- common/autotest_common.sh@549 -- # xtrace_disable 00:04:53.649 10:23:15 -- common/autotest_common.sh@10 -- # set +x 00:04:53.909 10:23:15 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:04:53.909 10:23:15 -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:04:53.909 { 00:04:53.909 
"subsystems": [ 00:04:53.909 { 00:04:53.909 "subsystem": "scheduler", 00:04:53.909 "config": [ 00:04:53.909 { 00:04:53.909 "method": "framework_set_scheduler", 00:04:53.909 "params": { 00:04:53.909 "name": "static" 00:04:53.909 } 00:04:53.909 } 00:04:53.909 ] 00:04:53.909 }, 00:04:53.909 { 00:04:53.909 "subsystem": "vmd", 00:04:53.909 "config": [] 00:04:53.909 }, 00:04:53.909 { 00:04:53.909 "subsystem": "sock", 00:04:53.909 "config": [ 00:04:53.909 { 00:04:53.909 "method": "sock_impl_set_options", 00:04:53.909 "params": { 00:04:53.909 "impl_name": "posix", 00:04:53.909 "recv_buf_size": 2097152, 00:04:53.909 "send_buf_size": 2097152, 00:04:53.909 "enable_recv_pipe": true, 00:04:53.909 "enable_quickack": false, 00:04:53.909 "enable_placement_id": 0, 00:04:53.909 "enable_zerocopy_send_server": true, 00:04:53.909 "enable_zerocopy_send_client": false, 00:04:53.909 "zerocopy_threshold": 0, 00:04:53.909 "tls_version": 0, 00:04:53.909 "enable_ktls": false 00:04:53.909 } 00:04:53.909 }, 00:04:53.909 { 00:04:53.909 "method": "sock_impl_set_options", 00:04:53.909 "params": { 00:04:53.909 "impl_name": "ssl", 00:04:53.909 "recv_buf_size": 4096, 00:04:53.909 "send_buf_size": 4096, 00:04:53.909 "enable_recv_pipe": true, 00:04:53.909 "enable_quickack": false, 00:04:53.909 "enable_placement_id": 0, 00:04:53.909 "enable_zerocopy_send_server": true, 00:04:53.909 "enable_zerocopy_send_client": false, 00:04:53.909 "zerocopy_threshold": 0, 00:04:53.909 "tls_version": 0, 00:04:53.909 "enable_ktls": false 00:04:53.909 } 00:04:53.909 } 00:04:53.909 ] 00:04:53.909 }, 00:04:53.909 { 00:04:53.909 "subsystem": "iobuf", 00:04:53.909 "config": [ 00:04:53.909 { 00:04:53.909 "method": "iobuf_set_options", 00:04:53.909 "params": { 00:04:53.909 "small_pool_count": 8192, 00:04:53.909 "large_pool_count": 1024, 00:04:53.909 "small_bufsize": 8192, 00:04:53.909 "large_bufsize": 135168 00:04:53.909 } 00:04:53.909 } 00:04:53.909 ] 00:04:53.909 }, 00:04:53.909 { 00:04:53.909 "subsystem": "keyring", 00:04:53.909 "config": [] 00:04:53.909 }, 00:04:53.909 { 00:04:53.909 "subsystem": "vfio_user_target", 00:04:53.909 "config": null 00:04:53.909 }, 00:04:53.909 { 00:04:53.909 "subsystem": "accel", 00:04:53.909 "config": [ 00:04:53.909 { 00:04:53.909 "method": "accel_set_options", 00:04:53.909 "params": { 00:04:53.909 "small_cache_size": 128, 00:04:53.909 "large_cache_size": 16, 00:04:53.909 "task_count": 2048, 00:04:53.909 "sequence_count": 2048, 00:04:53.909 "buf_count": 2048 00:04:53.909 } 00:04:53.909 } 00:04:53.909 ] 00:04:53.909 }, 00:04:53.909 { 00:04:53.909 "subsystem": "bdev", 00:04:53.909 "config": [ 00:04:53.909 { 00:04:53.909 "method": "bdev_set_options", 00:04:53.909 "params": { 00:04:53.909 "bdev_io_pool_size": 65535, 00:04:53.909 "bdev_io_cache_size": 256, 00:04:53.909 "bdev_auto_examine": true, 00:04:53.909 "iobuf_small_cache_size": 128, 00:04:53.909 "iobuf_large_cache_size": 16 00:04:53.909 } 00:04:53.909 }, 00:04:53.909 { 00:04:53.909 "method": "bdev_raid_set_options", 00:04:53.909 "params": { 00:04:53.909 "process_window_size_kb": 1024 00:04:53.909 } 00:04:53.909 }, 00:04:53.909 { 00:04:53.909 "method": "bdev_nvme_set_options", 00:04:53.909 "params": { 00:04:53.909 "action_on_timeout": "none", 00:04:53.909 "timeout_us": 0, 00:04:53.909 "timeout_admin_us": 0, 00:04:53.909 "keep_alive_timeout_ms": 10000, 00:04:53.909 "arbitration_burst": 0, 00:04:53.909 "low_priority_weight": 0, 00:04:53.909 "medium_priority_weight": 0, 00:04:53.909 "high_priority_weight": 0, 00:04:53.909 "nvme_adminq_poll_period_us": 10000, 00:04:53.909 
"nvme_ioq_poll_period_us": 0, 00:04:53.909 "io_queue_requests": 0, 00:04:53.909 "delay_cmd_submit": true, 00:04:53.909 "transport_retry_count": 4, 00:04:53.909 "bdev_retry_count": 3, 00:04:53.909 "transport_ack_timeout": 0, 00:04:53.909 "ctrlr_loss_timeout_sec": 0, 00:04:53.909 "reconnect_delay_sec": 0, 00:04:53.909 "fast_io_fail_timeout_sec": 0, 00:04:53.909 "disable_auto_failback": false, 00:04:53.909 "generate_uuids": false, 00:04:53.909 "transport_tos": 0, 00:04:53.909 "nvme_error_stat": false, 00:04:53.909 "rdma_srq_size": 0, 00:04:53.909 "io_path_stat": false, 00:04:53.909 "allow_accel_sequence": false, 00:04:53.909 "rdma_max_cq_size": 0, 00:04:53.909 "rdma_cm_event_timeout_ms": 0, 00:04:53.909 "dhchap_digests": [ 00:04:53.909 "sha256", 00:04:53.909 "sha384", 00:04:53.909 "sha512" 00:04:53.909 ], 00:04:53.909 "dhchap_dhgroups": [ 00:04:53.909 "null", 00:04:53.909 "ffdhe2048", 00:04:53.909 "ffdhe3072", 00:04:53.909 "ffdhe4096", 00:04:53.909 "ffdhe6144", 00:04:53.909 "ffdhe8192" 00:04:53.909 ] 00:04:53.909 } 00:04:53.909 }, 00:04:53.909 { 00:04:53.909 "method": "bdev_nvme_set_hotplug", 00:04:53.909 "params": { 00:04:53.909 "period_us": 100000, 00:04:53.909 "enable": false 00:04:53.909 } 00:04:53.909 }, 00:04:53.909 { 00:04:53.909 "method": "bdev_iscsi_set_options", 00:04:53.909 "params": { 00:04:53.909 "timeout_sec": 30 00:04:53.909 } 00:04:53.909 }, 00:04:53.909 { 00:04:53.909 "method": "bdev_wait_for_examine" 00:04:53.909 } 00:04:53.909 ] 00:04:53.909 }, 00:04:53.909 { 00:04:53.909 "subsystem": "nvmf", 00:04:53.909 "config": [ 00:04:53.909 { 00:04:53.909 "method": "nvmf_set_config", 00:04:53.909 "params": { 00:04:53.909 "discovery_filter": "match_any", 00:04:53.909 "admin_cmd_passthru": { 00:04:53.909 "identify_ctrlr": false 00:04:53.909 } 00:04:53.909 } 00:04:53.910 }, 00:04:53.910 { 00:04:53.910 "method": "nvmf_set_max_subsystems", 00:04:53.910 "params": { 00:04:53.910 "max_subsystems": 1024 00:04:53.910 } 00:04:53.910 }, 00:04:53.910 { 00:04:53.910 "method": "nvmf_set_crdt", 00:04:53.910 "params": { 00:04:53.910 "crdt1": 0, 00:04:53.910 "crdt2": 0, 00:04:53.910 "crdt3": 0 00:04:53.910 } 00:04:53.910 }, 00:04:53.910 { 00:04:53.910 "method": "nvmf_create_transport", 00:04:53.910 "params": { 00:04:53.910 "trtype": "TCP", 00:04:53.910 "max_queue_depth": 128, 00:04:53.910 "max_io_qpairs_per_ctrlr": 127, 00:04:53.910 "in_capsule_data_size": 4096, 00:04:53.910 "max_io_size": 131072, 00:04:53.910 "io_unit_size": 131072, 00:04:53.910 "max_aq_depth": 128, 00:04:53.910 "num_shared_buffers": 511, 00:04:53.910 "buf_cache_size": 4294967295, 00:04:53.910 "dif_insert_or_strip": false, 00:04:53.910 "zcopy": false, 00:04:53.910 "c2h_success": true, 00:04:53.910 "sock_priority": 0, 00:04:53.910 "abort_timeout_sec": 1, 00:04:53.910 "ack_timeout": 0 00:04:53.910 } 00:04:53.910 } 00:04:53.910 ] 00:04:53.910 }, 00:04:53.910 { 00:04:53.910 "subsystem": "nbd", 00:04:53.910 "config": [] 00:04:53.910 }, 00:04:53.910 { 00:04:53.910 "subsystem": "ublk", 00:04:53.910 "config": [] 00:04:53.910 }, 00:04:53.910 { 00:04:53.910 "subsystem": "vhost_blk", 00:04:53.910 "config": [] 00:04:53.910 }, 00:04:53.910 { 00:04:53.910 "subsystem": "scsi", 00:04:53.910 "config": null 00:04:53.910 }, 00:04:53.910 { 00:04:53.910 "subsystem": "iscsi", 00:04:53.910 "config": [ 00:04:53.910 { 00:04:53.910 "method": "iscsi_set_options", 00:04:53.910 "params": { 00:04:53.910 "node_base": "iqn.2016-06.io.spdk", 00:04:53.910 "max_sessions": 128, 00:04:53.910 "max_connections_per_session": 2, 00:04:53.910 "max_queue_depth": 64, 
00:04:53.910 "default_time2wait": 2, 00:04:53.910 "default_time2retain": 20, 00:04:53.910 "first_burst_length": 8192, 00:04:53.910 "immediate_data": true, 00:04:53.910 "allow_duplicated_isid": false, 00:04:53.910 "error_recovery_level": 0, 00:04:53.910 "nop_timeout": 60, 00:04:53.910 "nop_in_interval": 30, 00:04:53.910 "disable_chap": false, 00:04:53.910 "require_chap": false, 00:04:53.910 "mutual_chap": false, 00:04:53.910 "chap_group": 0, 00:04:53.910 "max_large_datain_per_connection": 64, 00:04:53.910 "max_r2t_per_connection": 4, 00:04:53.910 "pdu_pool_size": 36864, 00:04:53.910 "immediate_data_pool_size": 16384, 00:04:53.910 "data_out_pool_size": 2048 00:04:53.910 } 00:04:53.910 } 00:04:53.910 ] 00:04:53.910 }, 00:04:53.910 { 00:04:53.910 "subsystem": "vhost_scsi", 00:04:53.910 "config": [] 00:04:53.910 } 00:04:53.910 ] 00:04:53.910 } 00:04:53.910 10:23:15 -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:04:53.910 10:23:15 -- rpc/skip_rpc.sh@40 -- # killprocess 183968 00:04:53.910 10:23:15 -- common/autotest_common.sh@936 -- # '[' -z 183968 ']' 00:04:53.910 10:23:15 -- common/autotest_common.sh@940 -- # kill -0 183968 00:04:53.910 10:23:15 -- common/autotest_common.sh@941 -- # uname 00:04:53.910 10:23:15 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:04:53.910 10:23:15 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 183968 00:04:53.910 10:23:15 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:04:53.910 10:23:15 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:04:53.910 10:23:15 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 183968' 00:04:53.910 killing process with pid 183968 00:04:53.910 10:23:15 -- common/autotest_common.sh@955 -- # kill 183968 00:04:53.910 10:23:15 -- common/autotest_common.sh@960 -- # wait 183968 00:04:54.170 10:23:16 -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:04:54.170 10:23:16 -- rpc/skip_rpc.sh@47 -- # local spdk_pid=184184 00:04:54.170 10:23:16 -- rpc/skip_rpc.sh@48 -- # sleep 5 00:04:59.443 10:23:21 -- rpc/skip_rpc.sh@50 -- # killprocess 184184 00:04:59.443 10:23:21 -- common/autotest_common.sh@936 -- # '[' -z 184184 ']' 00:04:59.443 10:23:21 -- common/autotest_common.sh@940 -- # kill -0 184184 00:04:59.443 10:23:21 -- common/autotest_common.sh@941 -- # uname 00:04:59.443 10:23:21 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:04:59.443 10:23:21 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 184184 00:04:59.443 10:23:21 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:04:59.443 10:23:21 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:04:59.443 10:23:21 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 184184' 00:04:59.443 killing process with pid 184184 00:04:59.443 10:23:21 -- common/autotest_common.sh@955 -- # kill 184184 00:04:59.443 10:23:21 -- common/autotest_common.sh@960 -- # wait 184184 00:04:59.443 10:23:21 -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:04:59.443 10:23:21 -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:04:59.443 00:04:59.443 real 0m6.752s 00:04:59.443 user 0m6.555s 00:04:59.443 sys 0m0.644s 00:04:59.443 10:23:21 -- common/autotest_common.sh@1112 -- # xtrace_disable 
00:04:59.443 10:23:21 -- common/autotest_common.sh@10 -- # set +x 00:04:59.443 ************************************ 00:04:59.443 END TEST skip_rpc_with_json 00:04:59.443 ************************************ 00:04:59.703 10:23:21 -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:04:59.703 10:23:21 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:59.703 10:23:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:59.703 10:23:21 -- common/autotest_common.sh@10 -- # set +x 00:04:59.703 ************************************ 00:04:59.703 START TEST skip_rpc_with_delay 00:04:59.703 ************************************ 00:04:59.703 10:23:21 -- common/autotest_common.sh@1111 -- # test_skip_rpc_with_delay 00:04:59.703 10:23:21 -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:04:59.703 10:23:21 -- common/autotest_common.sh@638 -- # local es=0 00:04:59.703 10:23:21 -- common/autotest_common.sh@640 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:04:59.703 10:23:21 -- common/autotest_common.sh@626 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:04:59.703 10:23:21 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:04:59.703 10:23:21 -- common/autotest_common.sh@630 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:04:59.703 10:23:21 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:04:59.703 10:23:21 -- common/autotest_common.sh@632 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:04:59.703 10:23:21 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:04:59.703 10:23:21 -- common/autotest_common.sh@632 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:04:59.703 10:23:21 -- common/autotest_common.sh@632 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:04:59.703 10:23:21 -- common/autotest_common.sh@641 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:04:59.703 [2024-04-19 10:23:21.738790] app.c: 751:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
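The app.c error above is the whole point of skip_rpc_with_delay: this flag combination must be rejected at startup, since --wait-for-rpc parks initialization until an RPC arrives and --no-rpc-server guarantees none ever can. The assertion reduces to:

    # expected to fail before init completes, with the exact app.c error above
    ! build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc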
00:04:59.703 [2024-04-19 10:23:21.738953] app.c: 630:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:04:59.703 10:23:21 -- common/autotest_common.sh@641 -- # es=1 00:04:59.703 10:23:21 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:04:59.703 10:23:21 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:04:59.703 10:23:21 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:04:59.703 00:04:59.703 real 0m0.046s 00:04:59.703 user 0m0.016s 00:04:59.703 sys 0m0.029s 00:04:59.703 10:23:21 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:59.703 10:23:21 -- common/autotest_common.sh@10 -- # set +x 00:04:59.703 ************************************ 00:04:59.703 END TEST skip_rpc_with_delay 00:04:59.703 ************************************ 00:04:59.703 10:23:21 -- rpc/skip_rpc.sh@77 -- # uname 00:04:59.703 10:23:21 -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:04:59.703 10:23:21 -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:04:59.703 10:23:21 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:59.703 10:23:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:59.703 10:23:21 -- common/autotest_common.sh@10 -- # set +x 00:04:59.962 ************************************ 00:04:59.962 START TEST exit_on_failed_rpc_init 00:04:59.962 ************************************ 00:04:59.962 10:23:21 -- common/autotest_common.sh@1111 -- # test_exit_on_failed_rpc_init 00:04:59.962 10:23:21 -- rpc/skip_rpc.sh@62 -- # local spdk_pid=184944 00:04:59.962 10:23:21 -- rpc/skip_rpc.sh@63 -- # waitforlisten 184944 00:04:59.962 10:23:21 -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:04:59.962 10:23:21 -- common/autotest_common.sh@817 -- # '[' -z 184944 ']' 00:04:59.962 10:23:21 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:59.962 10:23:21 -- common/autotest_common.sh@822 -- # local max_retries=100 00:04:59.962 10:23:21 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:59.962 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:59.962 10:23:21 -- common/autotest_common.sh@826 -- # xtrace_disable 00:04:59.962 10:23:21 -- common/autotest_common.sh@10 -- # set +x 00:04:59.962 [2024-04-19 10:23:21.960761] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
00:04:59.962 [2024-04-19 10:23:21.960852] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid184944 ] 00:04:59.962 EAL: No free 2048 kB hugepages reported on node 1 00:04:59.962 [2024-04-19 10:23:22.044731] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:00.221 [2024-04-19 10:23:22.130210] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:00.790 10:23:22 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:00.790 10:23:22 -- common/autotest_common.sh@850 -- # return 0 00:05:00.790 10:23:22 -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:00.790 10:23:22 -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:00.790 10:23:22 -- common/autotest_common.sh@638 -- # local es=0 00:05:00.790 10:23:22 -- common/autotest_common.sh@640 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:00.790 10:23:22 -- common/autotest_common.sh@626 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:00.790 10:23:22 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:00.790 10:23:22 -- common/autotest_common.sh@630 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:00.790 10:23:22 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:00.790 10:23:22 -- common/autotest_common.sh@632 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:00.790 10:23:22 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:00.790 10:23:22 -- common/autotest_common.sh@632 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:00.790 10:23:22 -- common/autotest_common.sh@632 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:05:00.790 10:23:22 -- common/autotest_common.sh@641 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:00.790 [2024-04-19 10:23:22.814333] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:05:00.790 [2024-04-19 10:23:22.814414] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid185052 ] 00:05:00.790 EAL: No free 2048 kB hugepages reported on node 1 00:05:00.790 [2024-04-19 10:23:22.900980] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:01.050 [2024-04-19 10:23:22.977587] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:01.050 [2024-04-19 10:23:22.977671] rpc.c: 181:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
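The 'socket path in use' error above, together with the 'Unable to start RPC service' line that follows, is what exit_on_failed_rpc_init is fishing for: a second target on the same default socket must fail initialization and exit non-zero (the es=234 checked just below). In miniature, with an illustrative cleanup at the end:

    build/bin/spdk_tgt -m 0x1 & first=$!   # first instance owns /var/tmp/spdk.sock
    sleep 5
    ! build/bin/spdk_tgt -m 0x2            # rpc_listen fails; the app exits non-zero
    kill -9 $first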
00:05:01.050 [2024-04-19 10:23:22.977684] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:05:01.050 [2024-04-19 10:23:22.977692] app.c: 966:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:01.050 10:23:23 -- common/autotest_common.sh@641 -- # es=234 00:05:01.050 10:23:23 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:05:01.050 10:23:23 -- common/autotest_common.sh@650 -- # es=106 00:05:01.050 10:23:23 -- common/autotest_common.sh@651 -- # case "$es" in 00:05:01.050 10:23:23 -- common/autotest_common.sh@658 -- # es=1 00:05:01.050 10:23:23 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:05:01.050 10:23:23 -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:05:01.050 10:23:23 -- rpc/skip_rpc.sh@70 -- # killprocess 184944 00:05:01.050 10:23:23 -- common/autotest_common.sh@936 -- # '[' -z 184944 ']' 00:05:01.050 10:23:23 -- common/autotest_common.sh@940 -- # kill -0 184944 00:05:01.050 10:23:23 -- common/autotest_common.sh@941 -- # uname 00:05:01.050 10:23:23 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:01.050 10:23:23 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 184944 00:05:01.050 10:23:23 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:01.050 10:23:23 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:01.050 10:23:23 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 184944' 00:05:01.050 killing process with pid 184944 00:05:01.050 10:23:23 -- common/autotest_common.sh@955 -- # kill 184944 00:05:01.050 10:23:23 -- common/autotest_common.sh@960 -- # wait 184944 00:05:01.310 00:05:01.310 real 0m1.458s 00:05:01.310 user 0m1.643s 00:05:01.310 sys 0m0.450s 00:05:01.310 10:23:23 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:01.310 10:23:23 -- common/autotest_common.sh@10 -- # set +x 00:05:01.310 ************************************ 00:05:01.310 END TEST exit_on_failed_rpc_init 00:05:01.310 ************************************ 00:05:01.569 10:23:23 -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:05:01.569 00:05:01.569 real 0m14.424s 00:05:01.569 user 0m13.603s 00:05:01.569 sys 0m1.905s 00:05:01.569 10:23:23 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:01.569 10:23:23 -- common/autotest_common.sh@10 -- # set +x 00:05:01.569 ************************************ 00:05:01.569 END TEST skip_rpc 00:05:01.569 ************************************ 00:05:01.569 10:23:23 -- spdk/autotest.sh@167 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:01.569 10:23:23 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:01.569 10:23:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:01.569 10:23:23 -- common/autotest_common.sh@10 -- # set +x 00:05:01.569 ************************************ 00:05:01.569 START TEST rpc_client 00:05:01.569 ************************************ 00:05:01.569 10:23:23 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:01.829 * Looking for test storage... 
00:05:01.829 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:05:01.829 10:23:23 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:01.829 OK 00:05:01.829 10:23:23 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:01.829 00:05:01.829 real 0m0.136s 00:05:01.829 user 0m0.053s 00:05:01.829 sys 0m0.093s 00:05:01.829 10:23:23 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:01.829 10:23:23 -- common/autotest_common.sh@10 -- # set +x 00:05:01.829 ************************************ 00:05:01.829 END TEST rpc_client 00:05:01.829 ************************************ 00:05:01.829 10:23:23 -- spdk/autotest.sh@168 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:01.829 10:23:23 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:01.829 10:23:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:01.829 10:23:23 -- common/autotest_common.sh@10 -- # set +x 00:05:01.829 ************************************ 00:05:01.829 START TEST json_config 00:05:01.829 ************************************ 00:05:01.829 10:23:23 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:02.089 10:23:23 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:02.089 10:23:23 -- nvmf/common.sh@7 -- # uname -s 00:05:02.089 10:23:23 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:02.089 10:23:23 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:02.089 10:23:23 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:02.089 10:23:23 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:02.089 10:23:24 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:02.089 10:23:24 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:02.089 10:23:24 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:02.089 10:23:24 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:02.089 10:23:24 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:02.089 10:23:24 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:02.089 10:23:24 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:800e967b-538f-e911-906e-001635649f5c 00:05:02.089 10:23:24 -- nvmf/common.sh@18 -- # NVME_HOSTID=800e967b-538f-e911-906e-001635649f5c 00:05:02.089 10:23:24 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:02.089 10:23:24 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:02.089 10:23:24 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:02.089 10:23:24 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:02.089 10:23:24 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:02.089 10:23:24 -- scripts/common.sh@502 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:02.089 10:23:24 -- scripts/common.sh@510 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:02.089 10:23:24 -- scripts/common.sh@511 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:02.089 10:23:24 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:02.089 10:23:24 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:02.089 10:23:24 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:02.089 10:23:24 -- paths/export.sh@5 -- # export PATH 00:05:02.089 10:23:24 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:02.089 10:23:24 -- nvmf/common.sh@47 -- # : 0 00:05:02.089 10:23:24 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:05:02.089 10:23:24 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:05:02.089 10:23:24 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:02.089 10:23:24 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:02.089 10:23:24 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:02.089 10:23:24 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:05:02.089 10:23:24 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:05:02.089 10:23:24 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:05:02.089 10:23:24 -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:05:02.089 10:23:24 -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:05:02.089 10:23:24 -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:05:02.089 10:23:24 -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:05:02.089 10:23:24 -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:02.089 10:23:24 -- json_config/json_config.sh@27 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:02.089 WARNING: No tests are enabled so not running JSON configuration tests 00:05:02.089 10:23:24 -- json_config/json_config.sh@28 -- # exit 0 00:05:02.089 00:05:02.089 real 0m0.114s 00:05:02.089 user 0m0.055s 00:05:02.089 sys 0m0.061s 00:05:02.089 10:23:24 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:02.089 10:23:24 -- common/autotest_common.sh@10 -- # set +x 00:05:02.089 ************************************ 00:05:02.089 END TEST 
json_config 00:05:02.089 ************************************ 00:05:02.089 10:23:24 -- spdk/autotest.sh@169 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:02.089 10:23:24 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:02.089 10:23:24 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:02.089 10:23:24 -- common/autotest_common.sh@10 -- # set +x 00:05:02.089 ************************************ 00:05:02.089 START TEST json_config_extra_key 00:05:02.089 ************************************ 00:05:02.089 10:23:24 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:02.348 10:23:24 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:02.348 10:23:24 -- nvmf/common.sh@7 -- # uname -s 00:05:02.348 10:23:24 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:02.348 10:23:24 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:02.348 10:23:24 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:02.348 10:23:24 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:02.348 10:23:24 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:02.348 10:23:24 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:02.348 10:23:24 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:02.348 10:23:24 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:02.348 10:23:24 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:02.348 10:23:24 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:02.348 10:23:24 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:800e967b-538f-e911-906e-001635649f5c 00:05:02.348 10:23:24 -- nvmf/common.sh@18 -- # NVME_HOSTID=800e967b-538f-e911-906e-001635649f5c 00:05:02.348 10:23:24 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:02.348 10:23:24 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:02.349 10:23:24 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:02.349 10:23:24 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:02.349 10:23:24 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:02.349 10:23:24 -- scripts/common.sh@502 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:02.349 10:23:24 -- scripts/common.sh@510 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:02.349 10:23:24 -- scripts/common.sh@511 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:02.349 10:23:24 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:02.349 10:23:24 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:02.349 10:23:24 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:02.349 10:23:24 -- paths/export.sh@5 -- # export PATH 00:05:02.349 10:23:24 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:02.349 10:23:24 -- nvmf/common.sh@47 -- # : 0 00:05:02.349 10:23:24 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:05:02.349 10:23:24 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:05:02.349 10:23:24 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:02.349 10:23:24 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:02.349 10:23:24 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:02.349 10:23:24 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:05:02.349 10:23:24 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:05:02.349 10:23:24 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:05:02.349 10:23:24 -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:05:02.349 10:23:24 -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:05:02.349 10:23:24 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:05:02.349 10:23:24 -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:02.349 10:23:24 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:05:02.349 10:23:24 -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:02.349 10:23:24 -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:05:02.349 10:23:24 -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:02.349 10:23:24 -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:05:02.349 10:23:24 -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:02.349 10:23:24 -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:05:02.349 INFO: launching applications... 
00:05:02.349 10:23:24 -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:02.349 10:23:24 -- json_config/common.sh@9 -- # local app=target 00:05:02.349 10:23:24 -- json_config/common.sh@10 -- # shift 00:05:02.349 10:23:24 -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:05:02.349 10:23:24 -- json_config/common.sh@13 -- # [[ -z '' ]] 00:05:02.349 10:23:24 -- json_config/common.sh@15 -- # local app_extra_params= 00:05:02.349 10:23:24 -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:02.349 10:23:24 -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:02.349 10:23:24 -- json_config/common.sh@22 -- # app_pid["$app"]=185455 00:05:02.349 10:23:24 -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:05:02.349 Waiting for target to run... 00:05:02.349 10:23:24 -- json_config/common.sh@25 -- # waitforlisten 185455 /var/tmp/spdk_tgt.sock 00:05:02.349 10:23:24 -- json_config/common.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:02.349 10:23:24 -- common/autotest_common.sh@817 -- # '[' -z 185455 ']' 00:05:02.349 10:23:24 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:02.349 10:23:24 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:02.349 10:23:24 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:02.349 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:02.349 10:23:24 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:02.349 10:23:24 -- common/autotest_common.sh@10 -- # set +x 00:05:02.349 [2024-04-19 10:23:24.336435] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:05:02.349 [2024-04-19 10:23:24.336516] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid185455 ] 00:05:02.349 EAL: No free 2048 kB hugepages reported on node 1 00:05:02.608 [2024-04-19 10:23:24.635444] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:02.608 [2024-04-19 10:23:24.702364] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:03.177 10:23:25 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:03.177 10:23:25 -- common/autotest_common.sh@850 -- # return 0 00:05:03.177 10:23:25 -- json_config/common.sh@26 -- # echo '' 00:05:03.177 00:05:03.177 10:23:25 -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:05:03.177 INFO: shutting down applications... 
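A note on the launch pattern traced above: json_config_test_start_app backgrounds spdk_tgt with the extra-key JSON and blocks until the target's RPC socket answers. A minimal sketch of that flow, assuming spdk_tgt is on PATH and the working directory is the SPDK repo root; the polling loop here is illustrative, the real waitforlisten helper lives in autotest_common.sh:

    # Launch an SPDK target with a JSON config and wait for its RPC socket.
    app_sock=/var/tmp/spdk_tgt.sock
    spdk_tgt -m 0x1 -s 1024 -r "$app_sock" --json extra_key.json &
    app_pid=$!
    # Poll until the UNIX-domain socket accepts an RPC (illustrative loop).
    for _ in $(seq 1 100); do
        scripts/rpc.py -s "$app_sock" rpc_get_methods >/dev/null 2>&1 && break
        sleep 0.1
    done
    echo "target $app_pid is up"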
00:05:03.177 10:23:25 -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:05:03.178 10:23:25 -- json_config/common.sh@31 -- # local app=target 00:05:03.178 10:23:25 -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:05:03.178 10:23:25 -- json_config/common.sh@35 -- # [[ -n 185455 ]] 00:05:03.178 10:23:25 -- json_config/common.sh@38 -- # kill -SIGINT 185455 00:05:03.178 10:23:25 -- json_config/common.sh@40 -- # (( i = 0 )) 00:05:03.178 10:23:25 -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:03.178 10:23:25 -- json_config/common.sh@41 -- # kill -0 185455 00:05:03.178 10:23:25 -- json_config/common.sh@45 -- # sleep 0.5 00:05:03.747 10:23:25 -- json_config/common.sh@40 -- # (( i++ )) 00:05:03.747 10:23:25 -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:03.747 10:23:25 -- json_config/common.sh@41 -- # kill -0 185455 00:05:03.747 10:23:25 -- json_config/common.sh@42 -- # app_pid["$app"]= 00:05:03.747 10:23:25 -- json_config/common.sh@43 -- # break 00:05:03.747 10:23:25 -- json_config/common.sh@48 -- # [[ -n '' ]] 00:05:03.747 10:23:25 -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:05:03.747 SPDK target shutdown done 00:05:03.747 10:23:25 -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:05:03.747 Success 00:05:03.747 00:05:03.747 real 0m1.469s 00:05:03.747 user 0m1.204s 00:05:03.747 sys 0m0.407s 00:05:03.747 10:23:25 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:03.747 10:23:25 -- common/autotest_common.sh@10 -- # set +x 00:05:03.747 ************************************ 00:05:03.747 END TEST json_config_extra_key 00:05:03.747 ************************************ 00:05:03.747 10:23:25 -- spdk/autotest.sh@170 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:03.747 10:23:25 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:03.747 10:23:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:03.747 10:23:25 -- common/autotest_common.sh@10 -- # set +x 00:05:03.747 ************************************ 00:05:03.747 START TEST alias_rpc 00:05:03.747 ************************************ 00:05:03.747 10:23:25 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:04.007 * Looking for test storage... 00:05:04.007 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:05:04.007 10:23:25 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:04.007 10:23:25 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=185680 00:05:04.007 10:23:25 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 185680 00:05:04.007 10:23:25 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:04.007 10:23:25 -- common/autotest_common.sh@817 -- # '[' -z 185680 ']' 00:05:04.007 10:23:25 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:04.007 10:23:25 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:04.007 10:23:25 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:04.007 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
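The countdown traced just above is json_config/common.sh's shutdown helper: SIGINT the target, then poll with kill -0 until the pid disappears. A sketch of the same loop, with $app_pid standing in for the traced pid 185455:

    # Graceful shutdown: SIGINT, then poll up to 30 times at 0.5 s intervals.
    kill -SIGINT "$app_pid"
    for (( i = 0; i < 30; i++ )); do
        if ! kill -0 "$app_pid" 2>/dev/null; then
            echo 'SPDK target shutdown done'
            break
        fi
        sleep 0.5
    done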
00:05:04.007 10:23:25 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:04.007 10:23:25 -- common/autotest_common.sh@10 -- # set +x 00:05:04.007 [2024-04-19 10:23:25.971959] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:05:04.007 [2024-04-19 10:23:25.972033] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid185680 ] 00:05:04.007 EAL: No free 2048 kB hugepages reported on node 1 00:05:04.007 [2024-04-19 10:23:26.053795] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:04.266 [2024-04-19 10:23:26.139308] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:04.836 10:23:26 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:04.836 10:23:26 -- common/autotest_common.sh@850 -- # return 0 00:05:04.836 10:23:26 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:05.095 10:23:27 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 185680 00:05:05.095 10:23:27 -- common/autotest_common.sh@936 -- # '[' -z 185680 ']' 00:05:05.095 10:23:27 -- common/autotest_common.sh@940 -- # kill -0 185680 00:05:05.095 10:23:27 -- common/autotest_common.sh@941 -- # uname 00:05:05.095 10:23:27 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:05.095 10:23:27 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 185680 00:05:05.095 10:23:27 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:05.096 10:23:27 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:05.096 10:23:27 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 185680' 00:05:05.096 killing process with pid 185680 00:05:05.096 10:23:27 -- common/autotest_common.sh@955 -- # kill 185680 00:05:05.096 10:23:27 -- common/autotest_common.sh@960 -- # wait 185680 00:05:05.355 00:05:05.355 real 0m1.518s 00:05:05.355 user 0m1.613s 00:05:05.355 sys 0m0.466s 00:05:05.355 10:23:27 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:05.355 10:23:27 -- common/autotest_common.sh@10 -- # set +x 00:05:05.355 ************************************ 00:05:05.355 END TEST alias_rpc 00:05:05.355 ************************************ 00:05:05.355 10:23:27 -- spdk/autotest.sh@172 -- # [[ 0 -eq 0 ]] 00:05:05.355 10:23:27 -- spdk/autotest.sh@173 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:05.355 10:23:27 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:05.355 10:23:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:05.355 10:23:27 -- common/autotest_common.sh@10 -- # set +x 00:05:05.615 ************************************ 00:05:05.615 START TEST spdkcli_tcp 00:05:05.615 ************************************ 00:05:05.615 10:23:27 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:05.615 * Looking for test storage... 
00:05:05.615 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:05:05.615 10:23:27 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:05:05.615 10:23:27 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:05.615 10:23:27 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:05:05.615 10:23:27 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:05.615 10:23:27 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:05.615 10:23:27 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:05.615 10:23:27 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:05.615 10:23:27 -- common/autotest_common.sh@710 -- # xtrace_disable 00:05:05.615 10:23:27 -- common/autotest_common.sh@10 -- # set +x 00:05:05.615 10:23:27 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=185921 00:05:05.615 10:23:27 -- spdkcli/tcp.sh@27 -- # waitforlisten 185921 00:05:05.615 10:23:27 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:05.615 10:23:27 -- common/autotest_common.sh@817 -- # '[' -z 185921 ']' 00:05:05.615 10:23:27 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:05.615 10:23:27 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:05.615 10:23:27 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:05.615 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:05.615 10:23:27 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:05.615 10:23:27 -- common/autotest_common.sh@10 -- # set +x 00:05:05.615 [2024-04-19 10:23:27.670644] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
00:05:05.615 [2024-04-19 10:23:27.670732] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid185921 ] 00:05:05.615 EAL: No free 2048 kB hugepages reported on node 1 00:05:05.875 [2024-04-19 10:23:27.756513] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:05.875 [2024-04-19 10:23:27.833476] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:05.875 [2024-04-19 10:23:27.833476] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:06.444 10:23:28 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:06.444 10:23:28 -- common/autotest_common.sh@850 -- # return 0 00:05:06.444 10:23:28 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:06.444 10:23:28 -- spdkcli/tcp.sh@31 -- # socat_pid=186092 00:05:06.444 10:23:28 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:06.704 [ 00:05:06.704 "spdk_get_version", 00:05:06.704 "rpc_get_methods", 00:05:06.704 "trace_get_info", 00:05:06.704 "trace_get_tpoint_group_mask", 00:05:06.704 "trace_disable_tpoint_group", 00:05:06.704 "trace_enable_tpoint_group", 00:05:06.704 "trace_clear_tpoint_mask", 00:05:06.704 "trace_set_tpoint_mask", 00:05:06.704 "vfu_tgt_set_base_path", 00:05:06.704 "framework_get_pci_devices", 00:05:06.704 "framework_get_config", 00:05:06.704 "framework_get_subsystems", 00:05:06.704 "keyring_get_keys", 00:05:06.704 "iobuf_get_stats", 00:05:06.704 "iobuf_set_options", 00:05:06.704 "sock_set_default_impl", 00:05:06.704 "sock_impl_set_options", 00:05:06.704 "sock_impl_get_options", 00:05:06.704 "vmd_rescan", 00:05:06.704 "vmd_remove_device", 00:05:06.704 "vmd_enable", 00:05:06.704 "accel_get_stats", 00:05:06.704 "accel_set_options", 00:05:06.704 "accel_set_driver", 00:05:06.704 "accel_crypto_key_destroy", 00:05:06.704 "accel_crypto_keys_get", 00:05:06.704 "accel_crypto_key_create", 00:05:06.704 "accel_assign_opc", 00:05:06.704 "accel_get_module_info", 00:05:06.704 "accel_get_opc_assignments", 00:05:06.704 "notify_get_notifications", 00:05:06.704 "notify_get_types", 00:05:06.704 "bdev_get_histogram", 00:05:06.704 "bdev_enable_histogram", 00:05:06.704 "bdev_set_qos_limit", 00:05:06.704 "bdev_set_qd_sampling_period", 00:05:06.704 "bdev_get_bdevs", 00:05:06.704 "bdev_reset_iostat", 00:05:06.704 "bdev_get_iostat", 00:05:06.704 "bdev_examine", 00:05:06.704 "bdev_wait_for_examine", 00:05:06.704 "bdev_set_options", 00:05:06.704 "scsi_get_devices", 00:05:06.704 "thread_set_cpumask", 00:05:06.704 "framework_get_scheduler", 00:05:06.704 "framework_set_scheduler", 00:05:06.704 "framework_get_reactors", 00:05:06.704 "thread_get_io_channels", 00:05:06.704 "thread_get_pollers", 00:05:06.704 "thread_get_stats", 00:05:06.704 "framework_monitor_context_switch", 00:05:06.704 "spdk_kill_instance", 00:05:06.704 "log_enable_timestamps", 00:05:06.704 "log_get_flags", 00:05:06.704 "log_clear_flag", 00:05:06.704 "log_set_flag", 00:05:06.704 "log_get_level", 00:05:06.704 "log_set_level", 00:05:06.704 "log_get_print_level", 00:05:06.704 "log_set_print_level", 00:05:06.704 "framework_enable_cpumask_locks", 00:05:06.704 "framework_disable_cpumask_locks", 00:05:06.704 "framework_wait_init", 00:05:06.704 "framework_start_init", 00:05:06.704 "virtio_blk_create_transport", 00:05:06.704 "virtio_blk_get_transports", 
00:05:06.704 "vhost_controller_set_coalescing", 00:05:06.704 "vhost_get_controllers", 00:05:06.704 "vhost_delete_controller", 00:05:06.704 "vhost_create_blk_controller", 00:05:06.704 "vhost_scsi_controller_remove_target", 00:05:06.704 "vhost_scsi_controller_add_target", 00:05:06.704 "vhost_start_scsi_controller", 00:05:06.704 "vhost_create_scsi_controller", 00:05:06.704 "ublk_recover_disk", 00:05:06.704 "ublk_get_disks", 00:05:06.704 "ublk_stop_disk", 00:05:06.704 "ublk_start_disk", 00:05:06.704 "ublk_destroy_target", 00:05:06.704 "ublk_create_target", 00:05:06.704 "nbd_get_disks", 00:05:06.704 "nbd_stop_disk", 00:05:06.704 "nbd_start_disk", 00:05:06.704 "env_dpdk_get_mem_stats", 00:05:06.704 "nvmf_subsystem_get_listeners", 00:05:06.704 "nvmf_subsystem_get_qpairs", 00:05:06.704 "nvmf_subsystem_get_controllers", 00:05:06.704 "nvmf_get_stats", 00:05:06.704 "nvmf_get_transports", 00:05:06.704 "nvmf_create_transport", 00:05:06.704 "nvmf_get_targets", 00:05:06.704 "nvmf_delete_target", 00:05:06.704 "nvmf_create_target", 00:05:06.704 "nvmf_subsystem_allow_any_host", 00:05:06.704 "nvmf_subsystem_remove_host", 00:05:06.704 "nvmf_subsystem_add_host", 00:05:06.704 "nvmf_ns_remove_host", 00:05:06.704 "nvmf_ns_add_host", 00:05:06.704 "nvmf_subsystem_remove_ns", 00:05:06.704 "nvmf_subsystem_add_ns", 00:05:06.704 "nvmf_subsystem_listener_set_ana_state", 00:05:06.704 "nvmf_discovery_get_referrals", 00:05:06.704 "nvmf_discovery_remove_referral", 00:05:06.704 "nvmf_discovery_add_referral", 00:05:06.704 "nvmf_subsystem_remove_listener", 00:05:06.704 "nvmf_subsystem_add_listener", 00:05:06.704 "nvmf_delete_subsystem", 00:05:06.704 "nvmf_create_subsystem", 00:05:06.704 "nvmf_get_subsystems", 00:05:06.704 "nvmf_set_crdt", 00:05:06.704 "nvmf_set_config", 00:05:06.704 "nvmf_set_max_subsystems", 00:05:06.705 "iscsi_set_options", 00:05:06.705 "iscsi_get_auth_groups", 00:05:06.705 "iscsi_auth_group_remove_secret", 00:05:06.705 "iscsi_auth_group_add_secret", 00:05:06.705 "iscsi_delete_auth_group", 00:05:06.705 "iscsi_create_auth_group", 00:05:06.705 "iscsi_set_discovery_auth", 00:05:06.705 "iscsi_get_options", 00:05:06.705 "iscsi_target_node_request_logout", 00:05:06.705 "iscsi_target_node_set_redirect", 00:05:06.705 "iscsi_target_node_set_auth", 00:05:06.705 "iscsi_target_node_add_lun", 00:05:06.705 "iscsi_get_stats", 00:05:06.705 "iscsi_get_connections", 00:05:06.705 "iscsi_portal_group_set_auth", 00:05:06.705 "iscsi_start_portal_group", 00:05:06.705 "iscsi_delete_portal_group", 00:05:06.705 "iscsi_create_portal_group", 00:05:06.705 "iscsi_get_portal_groups", 00:05:06.705 "iscsi_delete_target_node", 00:05:06.705 "iscsi_target_node_remove_pg_ig_maps", 00:05:06.705 "iscsi_target_node_add_pg_ig_maps", 00:05:06.705 "iscsi_create_target_node", 00:05:06.705 "iscsi_get_target_nodes", 00:05:06.705 "iscsi_delete_initiator_group", 00:05:06.705 "iscsi_initiator_group_remove_initiators", 00:05:06.705 "iscsi_initiator_group_add_initiators", 00:05:06.705 "iscsi_create_initiator_group", 00:05:06.705 "iscsi_get_initiator_groups", 00:05:06.705 "keyring_file_remove_key", 00:05:06.705 "keyring_file_add_key", 00:05:06.705 "vfu_virtio_create_scsi_endpoint", 00:05:06.705 "vfu_virtio_scsi_remove_target", 00:05:06.705 "vfu_virtio_scsi_add_target", 00:05:06.705 "vfu_virtio_create_blk_endpoint", 00:05:06.705 "vfu_virtio_delete_endpoint", 00:05:06.705 "iaa_scan_accel_module", 00:05:06.705 "dsa_scan_accel_module", 00:05:06.705 "ioat_scan_accel_module", 00:05:06.705 "accel_error_inject_error", 00:05:06.705 "bdev_iscsi_delete", 00:05:06.705 
"bdev_iscsi_create", 00:05:06.705 "bdev_iscsi_set_options", 00:05:06.705 "bdev_virtio_attach_controller", 00:05:06.705 "bdev_virtio_scsi_get_devices", 00:05:06.705 "bdev_virtio_detach_controller", 00:05:06.705 "bdev_virtio_blk_set_hotplug", 00:05:06.705 "bdev_ftl_set_property", 00:05:06.705 "bdev_ftl_get_properties", 00:05:06.705 "bdev_ftl_get_stats", 00:05:06.705 "bdev_ftl_unmap", 00:05:06.705 "bdev_ftl_unload", 00:05:06.705 "bdev_ftl_delete", 00:05:06.705 "bdev_ftl_load", 00:05:06.705 "bdev_ftl_create", 00:05:06.705 "bdev_aio_delete", 00:05:06.705 "bdev_aio_rescan", 00:05:06.705 "bdev_aio_create", 00:05:06.705 "blobfs_create", 00:05:06.705 "blobfs_detect", 00:05:06.705 "blobfs_set_cache_size", 00:05:06.705 "bdev_zone_block_delete", 00:05:06.705 "bdev_zone_block_create", 00:05:06.705 "bdev_delay_delete", 00:05:06.705 "bdev_delay_create", 00:05:06.705 "bdev_delay_update_latency", 00:05:06.705 "bdev_split_delete", 00:05:06.705 "bdev_split_create", 00:05:06.705 "bdev_error_inject_error", 00:05:06.705 "bdev_error_delete", 00:05:06.705 "bdev_error_create", 00:05:06.705 "bdev_raid_set_options", 00:05:06.705 "bdev_raid_remove_base_bdev", 00:05:06.705 "bdev_raid_add_base_bdev", 00:05:06.705 "bdev_raid_delete", 00:05:06.705 "bdev_raid_create", 00:05:06.705 "bdev_raid_get_bdevs", 00:05:06.705 "bdev_lvol_grow_lvstore", 00:05:06.705 "bdev_lvol_get_lvols", 00:05:06.705 "bdev_lvol_get_lvstores", 00:05:06.705 "bdev_lvol_delete", 00:05:06.705 "bdev_lvol_set_read_only", 00:05:06.705 "bdev_lvol_resize", 00:05:06.705 "bdev_lvol_decouple_parent", 00:05:06.705 "bdev_lvol_inflate", 00:05:06.705 "bdev_lvol_rename", 00:05:06.705 "bdev_lvol_clone_bdev", 00:05:06.705 "bdev_lvol_clone", 00:05:06.705 "bdev_lvol_snapshot", 00:05:06.705 "bdev_lvol_create", 00:05:06.705 "bdev_lvol_delete_lvstore", 00:05:06.705 "bdev_lvol_rename_lvstore", 00:05:06.705 "bdev_lvol_create_lvstore", 00:05:06.705 "bdev_passthru_delete", 00:05:06.705 "bdev_passthru_create", 00:05:06.705 "bdev_nvme_cuse_unregister", 00:05:06.705 "bdev_nvme_cuse_register", 00:05:06.705 "bdev_opal_new_user", 00:05:06.705 "bdev_opal_set_lock_state", 00:05:06.705 "bdev_opal_delete", 00:05:06.705 "bdev_opal_get_info", 00:05:06.705 "bdev_opal_create", 00:05:06.705 "bdev_nvme_opal_revert", 00:05:06.705 "bdev_nvme_opal_init", 00:05:06.705 "bdev_nvme_send_cmd", 00:05:06.705 "bdev_nvme_get_path_iostat", 00:05:06.705 "bdev_nvme_get_mdns_discovery_info", 00:05:06.705 "bdev_nvme_stop_mdns_discovery", 00:05:06.705 "bdev_nvme_start_mdns_discovery", 00:05:06.705 "bdev_nvme_set_multipath_policy", 00:05:06.705 "bdev_nvme_set_preferred_path", 00:05:06.705 "bdev_nvme_get_io_paths", 00:05:06.705 "bdev_nvme_remove_error_injection", 00:05:06.705 "bdev_nvme_add_error_injection", 00:05:06.705 "bdev_nvme_get_discovery_info", 00:05:06.705 "bdev_nvme_stop_discovery", 00:05:06.705 "bdev_nvme_start_discovery", 00:05:06.705 "bdev_nvme_get_controller_health_info", 00:05:06.705 "bdev_nvme_disable_controller", 00:05:06.705 "bdev_nvme_enable_controller", 00:05:06.705 "bdev_nvme_reset_controller", 00:05:06.705 "bdev_nvme_get_transport_statistics", 00:05:06.705 "bdev_nvme_apply_firmware", 00:05:06.705 "bdev_nvme_detach_controller", 00:05:06.705 "bdev_nvme_get_controllers", 00:05:06.705 "bdev_nvme_attach_controller", 00:05:06.705 "bdev_nvme_set_hotplug", 00:05:06.705 "bdev_nvme_set_options", 00:05:06.705 "bdev_null_resize", 00:05:06.705 "bdev_null_delete", 00:05:06.705 "bdev_null_create", 00:05:06.705 "bdev_malloc_delete", 00:05:06.705 "bdev_malloc_create" 00:05:06.705 ] 00:05:06.705 10:23:28 -- 
spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:05:06.705 10:23:28 -- common/autotest_common.sh@716 -- # xtrace_disable 00:05:06.705 10:23:28 -- common/autotest_common.sh@10 -- # set +x 00:05:06.705 10:23:28 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:06.705 10:23:28 -- spdkcli/tcp.sh@38 -- # killprocess 185921 00:05:06.705 10:23:28 -- common/autotest_common.sh@936 -- # '[' -z 185921 ']' 00:05:06.705 10:23:28 -- common/autotest_common.sh@940 -- # kill -0 185921 00:05:06.705 10:23:28 -- common/autotest_common.sh@941 -- # uname 00:05:06.705 10:23:28 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:06.705 10:23:28 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 185921 00:05:06.705 10:23:28 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:06.705 10:23:28 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:06.705 10:23:28 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 185921' 00:05:06.705 killing process with pid 185921 00:05:06.705 10:23:28 -- common/autotest_common.sh@955 -- # kill 185921 00:05:06.705 10:23:28 -- common/autotest_common.sh@960 -- # wait 185921 00:05:06.965 00:05:06.965 real 0m1.547s 00:05:06.965 user 0m2.816s 00:05:06.965 sys 0m0.503s 00:05:06.965 10:23:29 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:06.965 10:23:29 -- common/autotest_common.sh@10 -- # set +x 00:05:06.965 ************************************ 00:05:06.965 END TEST spdkcli_tcp 00:05:06.965 ************************************ 00:05:07.225 10:23:29 -- spdk/autotest.sh@176 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:07.225 10:23:29 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:07.225 10:23:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:07.225 10:23:29 -- common/autotest_common.sh@10 -- # set +x 00:05:07.225 ************************************ 00:05:07.225 START TEST dpdk_mem_utility 00:05:07.225 ************************************ 00:05:07.225 10:23:29 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:07.225 * Looking for test storage... 00:05:07.225 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:05:07.225 10:23:29 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:07.225 10:23:29 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=186178 00:05:07.225 10:23:29 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 186178 00:05:07.225 10:23:29 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:07.485 10:23:29 -- common/autotest_common.sh@817 -- # '[' -z 186178 ']' 00:05:07.485 10:23:29 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:07.485 10:23:29 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:07.485 10:23:29 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:07.485 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
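Recapping the spdkcli_tcp run that ends above: the test bridges the target's UNIX-domain RPC socket to TCP with socat, then issues RPCs against 127.0.0.1:9998 (the rpc_get_methods listing printed earlier came back over that bridge). A sketch of the mechanic, using the same flags seen in the trace and assuming an SPDK repo root:

    # Expose the UNIX-domain RPC socket over TCP, query it, then tear down.
    socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &
    socat_pid=$!
    scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods
    kill "$socat_pid"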
00:05:07.485 10:23:29 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:07.485 10:23:29 -- common/autotest_common.sh@10 -- # set +x 00:05:07.485 [2024-04-19 10:23:29.358612] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:05:07.485 [2024-04-19 10:23:29.358681] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid186178 ] 00:05:07.485 EAL: No free 2048 kB hugepages reported on node 1 00:05:07.485 [2024-04-19 10:23:29.429735] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:07.485 [2024-04-19 10:23:29.505681] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:08.422 10:23:30 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:08.422 10:23:30 -- common/autotest_common.sh@850 -- # return 0 00:05:08.422 10:23:30 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:08.422 10:23:30 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:08.422 10:23:30 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:08.422 10:23:30 -- common/autotest_common.sh@10 -- # set +x 00:05:08.422 { 00:05:08.422 "filename": "/tmp/spdk_mem_dump.txt" 00:05:08.422 } 00:05:08.422 10:23:30 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:08.422 10:23:30 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:08.422 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:08.422 1 heaps totaling size 814.000000 MiB 00:05:08.422 size: 814.000000 MiB heap id: 0 00:05:08.422 end heaps---------- 00:05:08.422 8 mempools totaling size 598.116089 MiB 00:05:08.422 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:08.422 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:08.422 size: 84.521057 MiB name: bdev_io_186178 00:05:08.422 size: 51.011292 MiB name: evtpool_186178 00:05:08.422 size: 50.003479 MiB name: msgpool_186178 00:05:08.422 size: 21.763794 MiB name: PDU_Pool 00:05:08.422 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:08.422 size: 0.026123 MiB name: Session_Pool 00:05:08.422 end mempools------- 00:05:08.422 6 memzones totaling size 4.142822 MiB 00:05:08.422 size: 1.000366 MiB name: RG_ring_0_186178 00:05:08.422 size: 1.000366 MiB name: RG_ring_1_186178 00:05:08.422 size: 1.000366 MiB name: RG_ring_4_186178 00:05:08.422 size: 1.000366 MiB name: RG_ring_5_186178 00:05:08.422 size: 0.125366 MiB name: RG_ring_2_186178 00:05:08.422 size: 0.015991 MiB name: RG_ring_3_186178 00:05:08.422 end memzones------- 00:05:08.422 10:23:30 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:08.422 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:05:08.422 list of free elements. 
size: 12.519348 MiB 00:05:08.422 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:08.422 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:08.422 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:08.422 element at address: 0x200003e00000 with size: 0.996277 MiB 00:05:08.422 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:08.422 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:08.422 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:08.422 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:08.422 element at address: 0x200000200000 with size: 0.841614 MiB 00:05:08.422 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:05:08.422 element at address: 0x20000b200000 with size: 0.490723 MiB 00:05:08.422 element at address: 0x200000800000 with size: 0.487793 MiB 00:05:08.422 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:08.422 element at address: 0x200027e00000 with size: 0.410034 MiB 00:05:08.422 element at address: 0x200003a00000 with size: 0.355530 MiB 00:05:08.422 list of standard malloc elements. size: 199.218079 MiB 00:05:08.422 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:08.422 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:08.422 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:08.422 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:08.422 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:08.422 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:08.422 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:08.422 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:08.422 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:08.422 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:05:08.422 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:05:08.422 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:05:08.422 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:05:08.422 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:05:08.422 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:05:08.422 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:05:08.422 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:08.422 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:08.422 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:08.422 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:05:08.422 element at address: 0x200003adb300 with size: 0.000183 MiB 00:05:08.422 element at address: 0x200003adb500 with size: 0.000183 MiB 00:05:08.422 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:05:08.422 element at address: 0x200003affa80 with size: 0.000183 MiB 00:05:08.422 element at address: 0x200003affb40 with size: 0.000183 MiB 00:05:08.422 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:05:08.422 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:08.422 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:05:08.422 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:08.422 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:08.423 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:08.423 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:05:08.423 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:05:08.423 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:05:08.423 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:05:08.423 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:05:08.423 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:05:08.423 element at address: 0x200027e69040 with size: 0.000183 MiB 00:05:08.423 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:05:08.423 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:05:08.423 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:05:08.423 list of memzone associated elements. size: 602.262573 MiB 00:05:08.423 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:05:08.423 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:05:08.423 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:05:08.423 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:05:08.423 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:05:08.423 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_186178_0 00:05:08.423 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:05:08.423 associated memzone info: size: 48.002930 MiB name: MP_evtpool_186178_0 00:05:08.423 element at address: 0x200003fff380 with size: 48.003052 MiB 00:05:08.423 associated memzone info: size: 48.002930 MiB name: MP_msgpool_186178_0 00:05:08.423 element at address: 0x2000195be940 with size: 20.255554 MiB 00:05:08.423 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:05:08.423 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:05:08.423 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:05:08.423 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:05:08.423 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_186178 00:05:08.423 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:05:08.423 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_186178 00:05:08.423 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:05:08.423 associated memzone info: size: 1.007996 MiB name: MP_evtpool_186178 00:05:08.423 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:05:08.423 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:05:08.423 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:05:08.423 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:05:08.423 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:05:08.423 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:05:08.423 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:05:08.423 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:05:08.423 element at address: 0x200003eff180 with size: 1.000488 MiB 00:05:08.423 associated memzone info: size: 1.000366 MiB name: RG_ring_0_186178 00:05:08.423 element at address: 0x200003affc00 with size: 1.000488 MiB 00:05:08.423 associated memzone info: size: 1.000366 MiB name: RG_ring_1_186178 00:05:08.423 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:05:08.423 associated memzone info: size: 1.000366 MiB name: RG_ring_4_186178 00:05:08.423 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:05:08.423 associated memzone info: size: 1.000366 MiB name: RG_ring_5_186178 00:05:08.423 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:05:08.423 associated memzone 
info: size: 0.500366 MiB name: RG_MP_bdev_io_186178 00:05:08.423 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:05:08.423 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:05:08.423 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:05:08.423 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:05:08.423 element at address: 0x20001947c540 with size: 0.250488 MiB 00:05:08.423 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:05:08.423 element at address: 0x200003adf880 with size: 0.125488 MiB 00:05:08.423 associated memzone info: size: 0.125366 MiB name: RG_ring_2_186178 00:05:08.423 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:05:08.423 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:05:08.423 element at address: 0x200027e69100 with size: 0.023743 MiB 00:05:08.423 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:05:08.423 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:05:08.423 associated memzone info: size: 0.015991 MiB name: RG_ring_3_186178 00:05:08.423 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:05:08.423 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:05:08.423 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:05:08.423 associated memzone info: size: 0.000183 MiB name: MP_msgpool_186178 00:05:08.423 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:05:08.423 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_186178 00:05:08.423 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:05:08.423 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:05:08.423 10:23:30 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:05:08.423 10:23:30 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 186178 00:05:08.423 10:23:30 -- common/autotest_common.sh@936 -- # '[' -z 186178 ']' 00:05:08.423 10:23:30 -- common/autotest_common.sh@940 -- # kill -0 186178 00:05:08.423 10:23:30 -- common/autotest_common.sh@941 -- # uname 00:05:08.423 10:23:30 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:08.423 10:23:30 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 186178 00:05:08.423 10:23:30 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:08.423 10:23:30 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:08.423 10:23:30 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 186178' 00:05:08.423 killing process with pid 186178 00:05:08.423 10:23:30 -- common/autotest_common.sh@955 -- # kill 186178 00:05:08.423 10:23:30 -- common/autotest_common.sh@960 -- # wait 186178 00:05:08.683 00:05:08.683 real 0m1.441s 00:05:08.683 user 0m1.465s 00:05:08.683 sys 0m0.466s 00:05:08.683 10:23:30 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:08.683 10:23:30 -- common/autotest_common.sh@10 -- # set +x 00:05:08.683 ************************************ 00:05:08.683 END TEST dpdk_mem_utility 00:05:08.683 ************************************ 00:05:08.683 10:23:30 -- spdk/autotest.sh@177 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:08.683 10:23:30 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:08.683 10:23:30 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:08.683 10:23:30 -- common/autotest_common.sh@10 -- # set +x 00:05:08.942 
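The dpdk_mem_utility pass above boils down to two helpers: an RPC that dumps the target's DPDK memory state to /tmp/spdk_mem_dump.txt, and a script that renders it. A sketch of the same flow against a running target, from an SPDK repo root:

    # Dump DPDK memory state, then summarize heaps/mempools/memzones.
    scripts/rpc.py env_dpdk_get_mem_stats   # returns {"filename": "/tmp/spdk_mem_dump.txt"}
    scripts/dpdk_mem_info.py                # heap/mempool/memzone totals, as printed above
    scripts/dpdk_mem_info.py -m 0           # per-element detail for heap id 0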
************************************ 00:05:08.942 START TEST event 00:05:08.942 ************************************ 00:05:08.942 10:23:30 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:08.942 * Looking for test storage... 00:05:08.942 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:08.942 10:23:30 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:05:08.942 10:23:30 -- bdev/nbd_common.sh@6 -- # set -e 00:05:08.942 10:23:30 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:08.942 10:23:30 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:05:08.942 10:23:30 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:08.942 10:23:30 -- common/autotest_common.sh@10 -- # set +x 00:05:09.202 ************************************ 00:05:09.202 START TEST event_perf 00:05:09.202 ************************************ 00:05:09.202 10:23:31 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:09.202 Running I/O for 1 seconds...[2024-04-19 10:23:31.095698] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:05:09.202 [2024-04-19 10:23:31.095778] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid186575 ] 00:05:09.202 EAL: No free 2048 kB hugepages reported on node 1 00:05:09.202 [2024-04-19 10:23:31.184020] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:09.202 [2024-04-19 10:23:31.262435] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:09.202 [2024-04-19 10:23:31.262539] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:09.202 [2024-04-19 10:23:31.262638] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:09.202 [2024-04-19 10:23:31.262638] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:10.581 Running I/O for 1 seconds... 00:05:10.581 lcore 0: 189929 00:05:10.581 lcore 1: 189927 00:05:10.581 lcore 2: 189929 00:05:10.581 lcore 3: 189930 00:05:10.581 done. 
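The per-lcore counts above come from a one-second event_perf run on four reactors; roughly balanced counts (here about 189.9 k events per core, though absolute numbers vary by machine) are the expected healthy result. The invocation, as traced:

    # Four reactors (core mask 0xF), one-second measurement window.
    test/event/event_perf/event_perf -m 0xF -t 1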
00:05:10.581 00:05:10.581 real 0m1.254s 00:05:10.581 user 0m4.150s 00:05:10.581 sys 0m0.101s 00:05:10.581 10:23:32 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:10.581 10:23:32 -- common/autotest_common.sh@10 -- # set +x 00:05:10.581 ************************************ 00:05:10.581 END TEST event_perf 00:05:10.581 ************************************ 00:05:10.581 10:23:32 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:10.581 10:23:32 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:05:10.581 10:23:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:10.581 10:23:32 -- common/autotest_common.sh@10 -- # set +x 00:05:10.581 ************************************ 00:05:10.581 START TEST event_reactor 00:05:10.581 ************************************ 00:05:10.581 10:23:32 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:10.581 [2024-04-19 10:23:32.518861] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:05:10.581 [2024-04-19 10:23:32.518943] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid186778 ] 00:05:10.581 EAL: No free 2048 kB hugepages reported on node 1 00:05:10.581 [2024-04-19 10:23:32.606258] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:10.581 [2024-04-19 10:23:32.686649] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:11.960 test_start 00:05:11.960 oneshot 00:05:11.960 tick 100 00:05:11.960 tick 100 00:05:11.960 tick 250 00:05:11.960 tick 100 00:05:11.960 tick 100 00:05:11.960 tick 100 00:05:11.960 tick 250 00:05:11.960 tick 500 00:05:11.960 tick 100 00:05:11.960 tick 100 00:05:11.960 tick 250 00:05:11.960 tick 100 00:05:11.960 tick 100 00:05:11.960 test_end 00:05:11.960 00:05:11.960 real 0m1.252s 00:05:11.960 user 0m1.148s 00:05:11.960 sys 0m0.099s 00:05:11.960 10:23:33 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:11.960 10:23:33 -- common/autotest_common.sh@10 -- # set +x 00:05:11.960 ************************************ 00:05:11.960 END TEST event_reactor 00:05:11.960 ************************************ 00:05:11.960 10:23:33 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:11.960 10:23:33 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:05:11.960 10:23:33 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:11.961 10:23:33 -- common/autotest_common.sh@10 -- # set +x 00:05:11.961 ************************************ 00:05:11.961 START TEST event_reactor_perf 00:05:11.961 ************************************ 00:05:11.961 10:23:33 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:11.961 [2024-04-19 10:23:33.939186] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
00:05:11.961 [2024-04-19 10:23:33.939295] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid186971 ] 00:05:11.961 EAL: No free 2048 kB hugepages reported on node 1 00:05:11.961 [2024-04-19 10:23:34.027119] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:12.219 [2024-04-19 10:23:34.109949] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:13.157 test_start 00:05:13.157 test_end 00:05:13.157 Performance: 912127 events per second 00:05:13.157 00:05:13.157 real 0m1.254s 00:05:13.157 user 0m1.151s 00:05:13.157 sys 0m0.099s 00:05:13.157 10:23:35 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:13.157 10:23:35 -- common/autotest_common.sh@10 -- # set +x 00:05:13.157 ************************************ 00:05:13.157 END TEST event_reactor_perf 00:05:13.157 ************************************ 00:05:13.157 10:23:35 -- event/event.sh@49 -- # uname -s 00:05:13.157 10:23:35 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:05:13.157 10:23:35 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:13.157 10:23:35 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:13.157 10:23:35 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:13.157 10:23:35 -- common/autotest_common.sh@10 -- # set +x 00:05:13.416 ************************************ 00:05:13.416 START TEST event_scheduler 00:05:13.416 ************************************ 00:05:13.416 10:23:35 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:13.416 * Looking for test storage... 00:05:13.416 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:05:13.416 10:23:35 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:05:13.416 10:23:35 -- scheduler/scheduler.sh@35 -- # scheduler_pid=187193 00:05:13.416 10:23:35 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:05:13.416 10:23:35 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:05:13.416 10:23:35 -- scheduler/scheduler.sh@37 -- # waitforlisten 187193 00:05:13.416 10:23:35 -- common/autotest_common.sh@817 -- # '[' -z 187193 ']' 00:05:13.416 10:23:35 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:13.416 10:23:35 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:13.416 10:23:35 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:13.416 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:13.416 10:23:35 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:13.416 10:23:35 -- common/autotest_common.sh@10 -- # set +x 00:05:13.416 [2024-04-19 10:23:35.483411] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
00:05:13.416 [2024-04-19 10:23:35.483480] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid187193 ] 00:05:13.417 EAL: No free 2048 kB hugepages reported on node 1 00:05:13.675 [2024-04-19 10:23:35.567051] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:13.676 [2024-04-19 10:23:35.653673] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:13.676 [2024-04-19 10:23:35.653772] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:13.676 [2024-04-19 10:23:35.653853] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:13.676 [2024-04-19 10:23:35.653854] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:14.243 10:23:36 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:14.243 10:23:36 -- common/autotest_common.sh@850 -- # return 0 00:05:14.243 10:23:36 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:05:14.243 10:23:36 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:14.243 10:23:36 -- common/autotest_common.sh@10 -- # set +x 00:05:14.243 POWER: Env isn't set yet! 00:05:14.243 POWER: Attempting to initialise ACPI cpufreq power management... 00:05:14.243 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:14.243 POWER: Cannot set governor of lcore 0 to userspace 00:05:14.243 POWER: Attempting to initialise PSTAT power management... 00:05:14.243 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:05:14.243 POWER: Initialized successfully for lcore 0 power management 00:05:14.503 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:05:14.503 POWER: Initialized successfully for lcore 1 power management 00:05:14.503 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:05:14.503 POWER: Initialized successfully for lcore 2 power management 00:05:14.503 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:05:14.503 POWER: Initialized successfully for lcore 3 power management 00:05:14.503 10:23:36 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:14.503 10:23:36 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:05:14.503 10:23:36 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:14.503 10:23:36 -- common/autotest_common.sh@10 -- # set +x 00:05:14.503 [2024-04-19 10:23:36.464756] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
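The scheduler_create_thread test that starts below drives a test-only RPC plugin rather than the stock rpc.py commands: it spawns pinned active and idle threads across the four cores, retunes one thread's active percentage by id, and deletes another. A minimal sketch of the same call sequence, assuming (as the thread_id= captures in the trace suggest) that scheduler_thread_create prints the new thread id on stdout, and with $rpc standing in for the full scripts/rpc.py workspace path used above:

    rpc="$SPDK_DIR/scripts/rpc.py"   # $SPDK_DIR is a stand-in for the full workspace path
    # one busy thread pinned to core 0, one idle thread on the same core
    $rpc --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
    $rpc --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0
    # create an unloaded thread, then raise it to 50% active by id
    thread_id=$($rpc --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0)
    $rpc --plugin scheduler_plugin scheduler_thread_set_active "$thread_id" 50
    # create a throwaway thread and delete it by id
    thread_id=$($rpc --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100)
    $rpc --plugin scheduler_plugin scheduler_thread_delete "$thread_id"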
00:05:14.503 10:23:36 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:14.503 10:23:36 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:05:14.503 10:23:36 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:14.503 10:23:36 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:14.503 10:23:36 -- common/autotest_common.sh@10 -- # set +x 00:05:14.503 ************************************ 00:05:14.503 START TEST scheduler_create_thread 00:05:14.503 ************************************ 00:05:14.503 10:23:36 -- common/autotest_common.sh@1111 -- # scheduler_create_thread 00:05:14.503 10:23:36 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:05:14.503 10:23:36 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:14.503 10:23:36 -- common/autotest_common.sh@10 -- # set +x 00:05:14.503 2 00:05:14.503 10:23:36 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:14.503 10:23:36 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:05:14.503 10:23:36 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:14.503 10:23:36 -- common/autotest_common.sh@10 -- # set +x 00:05:14.503 3 00:05:14.503 10:23:36 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:14.503 10:23:36 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:05:14.503 10:23:36 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:14.503 10:23:36 -- common/autotest_common.sh@10 -- # set +x 00:05:14.762 4 00:05:14.762 10:23:36 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:14.762 10:23:36 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:05:14.762 10:23:36 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:14.762 10:23:36 -- common/autotest_common.sh@10 -- # set +x 00:05:14.762 5 00:05:14.762 10:23:36 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:14.762 10:23:36 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:05:14.762 10:23:36 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:14.762 10:23:36 -- common/autotest_common.sh@10 -- # set +x 00:05:14.762 6 00:05:14.762 10:23:36 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:14.762 10:23:36 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:05:14.762 10:23:36 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:14.762 10:23:36 -- common/autotest_common.sh@10 -- # set +x 00:05:14.762 7 00:05:14.763 10:23:36 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:14.763 10:23:36 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:05:14.763 10:23:36 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:14.763 10:23:36 -- common/autotest_common.sh@10 -- # set +x 00:05:14.763 8 00:05:14.763 10:23:36 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:14.763 10:23:36 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:05:14.763 10:23:36 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:14.763 10:23:36 -- common/autotest_common.sh@10 -- # set +x 00:05:14.763 9 00:05:14.763 
10:23:36 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:14.763 10:23:36 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:05:14.763 10:23:36 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:14.763 10:23:36 -- common/autotest_common.sh@10 -- # set +x 00:05:14.763 10 00:05:14.763 10:23:36 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:14.763 10:23:36 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:05:14.763 10:23:36 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:14.763 10:23:36 -- common/autotest_common.sh@10 -- # set +x 00:05:15.021 10:23:37 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:15.021 10:23:37 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:05:15.021 10:23:37 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:05:15.021 10:23:37 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:15.021 10:23:37 -- common/autotest_common.sh@10 -- # set +x 00:05:15.959 10:23:37 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:15.959 10:23:37 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:05:15.959 10:23:37 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:15.959 10:23:37 -- common/autotest_common.sh@10 -- # set +x 00:05:16.896 10:23:38 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:16.896 10:23:38 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:05:16.896 10:23:38 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:05:16.896 10:23:38 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:16.896 10:23:38 -- common/autotest_common.sh@10 -- # set +x 00:05:17.834 10:23:39 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:17.834 00:05:17.834 real 0m3.229s 00:05:17.834 user 0m0.026s 00:05:17.834 sys 0m0.004s 00:05:17.834 10:23:39 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:17.834 10:23:39 -- common/autotest_common.sh@10 -- # set +x 00:05:17.834 ************************************ 00:05:17.834 END TEST scheduler_create_thread 00:05:17.834 ************************************ 00:05:17.834 10:23:39 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:05:17.834 10:23:39 -- scheduler/scheduler.sh@46 -- # killprocess 187193 00:05:17.834 10:23:39 -- common/autotest_common.sh@936 -- # '[' -z 187193 ']' 00:05:17.834 10:23:39 -- common/autotest_common.sh@940 -- # kill -0 187193 00:05:17.834 10:23:39 -- common/autotest_common.sh@941 -- # uname 00:05:17.834 10:23:39 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:17.834 10:23:39 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 187193 00:05:17.834 10:23:39 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:05:17.834 10:23:39 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:05:17.834 10:23:39 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 187193' 00:05:17.834 killing process with pid 187193 00:05:17.834 10:23:39 -- common/autotest_common.sh@955 -- # kill 187193 00:05:17.834 10:23:39 -- common/autotest_common.sh@960 -- # wait 187193 00:05:18.093 [2024-04-19 10:23:40.193941] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
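The killprocess call visible above is what tears the scheduler app down and produces the governor-restore lines that follow: it verifies the pid is still alive with kill -0, reads the process name so it does not signal a sudo wrapper, then kills and waits so the reactors can exit cleanly. A condensed sketch of that pattern; the uname guard for non-Linux hosts is omitted, and the sudo branch is an assumption (it is not exercised here, since the name resolved to reactor_2):

    killprocess() {
        local pid=$1
        kill -0 "$pid" || return 1                # already gone
        local name
        name=$(ps --no-headers -o comm= "$pid")
        if [ "$name" = sudo ]; then
            return 1                              # assumption: refuse to signal the sudo wrapper
        fi
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"                               # let the reactors unwind before returning
    }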
00:05:18.353 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:05:18.353 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:05:18.353 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:05:18.353 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:05:18.353 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:05:18.353 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:05:18.353 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:05:18.353 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:05:18.353 00:05:18.353 real 0m5.091s 00:05:18.353 user 0m10.410s 00:05:18.353 sys 0m0.474s 00:05:18.353 10:23:40 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:18.353 10:23:40 -- common/autotest_common.sh@10 -- # set +x 00:05:18.353 ************************************ 00:05:18.353 END TEST event_scheduler 00:05:18.353 ************************************ 00:05:18.613 10:23:40 -- event/event.sh@51 -- # modprobe -n nbd 00:05:18.613 10:23:40 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:05:18.613 10:23:40 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:18.613 10:23:40 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:18.613 10:23:40 -- common/autotest_common.sh@10 -- # set +x 00:05:18.613 ************************************ 00:05:18.613 START TEST app_repeat 00:05:18.613 ************************************ 00:05:18.613 10:23:40 -- common/autotest_common.sh@1111 -- # app_repeat_test 00:05:18.613 10:23:40 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:18.613 10:23:40 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:18.613 10:23:40 -- event/event.sh@13 -- # local nbd_list 00:05:18.613 10:23:40 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:18.613 10:23:40 -- event/event.sh@14 -- # local bdev_list 00:05:18.613 10:23:40 -- event/event.sh@15 -- # local repeat_times=4 00:05:18.613 10:23:40 -- event/event.sh@17 -- # modprobe nbd 00:05:18.613 10:23:40 -- event/event.sh@19 -- # repeat_pid=187949 00:05:18.613 10:23:40 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:05:18.613 10:23:40 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:05:18.613 10:23:40 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 187949' 00:05:18.613 Process app_repeat pid: 187949 00:05:18.613 10:23:40 -- event/event.sh@23 -- # for i in {0..2} 00:05:18.613 10:23:40 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:05:18.613 spdk_app_start Round 0 00:05:18.613 10:23:40 -- event/event.sh@25 -- # waitforlisten 187949 /var/tmp/spdk-nbd.sock 00:05:18.613 10:23:40 -- common/autotest_common.sh@817 -- # '[' -z 187949 ']' 00:05:18.613 10:23:40 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:18.613 10:23:40 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:18.613 10:23:40 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:05:18.613 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:18.613 10:23:40 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:18.613 10:23:40 -- common/autotest_common.sh@10 -- # set +x 00:05:18.613 [2024-04-19 10:23:40.640364] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:05:18.613 [2024-04-19 10:23:40.640451] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid187949 ] 00:05:18.613 EAL: No free 2048 kB hugepages reported on node 1 00:05:18.613 [2024-04-19 10:23:40.715230] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:18.873 [2024-04-19 10:23:40.796898] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:18.873 [2024-04-19 10:23:40.796901] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:19.440 10:23:41 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:19.440 10:23:41 -- common/autotest_common.sh@850 -- # return 0 00:05:19.440 10:23:41 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:19.699 Malloc0 00:05:19.699 10:23:41 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:19.958 Malloc1 00:05:19.958 10:23:41 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:19.959 10:23:41 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:19.959 10:23:41 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:19.959 10:23:41 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:19.959 10:23:41 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:19.959 10:23:41 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:19.959 10:23:41 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:19.959 10:23:41 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:19.959 10:23:41 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:19.959 10:23:41 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:19.959 10:23:41 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:19.959 10:23:41 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:19.959 10:23:41 -- bdev/nbd_common.sh@12 -- # local i 00:05:19.959 10:23:41 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:19.959 10:23:41 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:19.959 10:23:41 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:19.959 /dev/nbd0 00:05:19.959 10:23:42 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:19.959 10:23:42 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:19.959 10:23:42 -- common/autotest_common.sh@854 -- # local nbd_name=nbd0 00:05:19.959 10:23:42 -- common/autotest_common.sh@855 -- # local i 00:05:19.959 10:23:42 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:05:19.959 10:23:42 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:05:19.959 10:23:42 -- common/autotest_common.sh@858 -- # grep -q -w nbd0 /proc/partitions 00:05:19.959 10:23:42 -- 
common/autotest_common.sh@859 -- # break 00:05:19.959 10:23:42 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:05:19.959 10:23:42 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:05:19.959 10:23:42 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:19.959 1+0 records in 00:05:19.959 1+0 records out 00:05:19.959 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000178386 s, 23.0 MB/s 00:05:19.959 10:23:42 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:20.218 10:23:42 -- common/autotest_common.sh@872 -- # size=4096 00:05:20.218 10:23:42 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:20.218 10:23:42 -- common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:05:20.218 10:23:42 -- common/autotest_common.sh@875 -- # return 0 00:05:20.218 10:23:42 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:20.218 10:23:42 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:20.218 10:23:42 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:20.218 /dev/nbd1 00:05:20.218 10:23:42 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:20.218 10:23:42 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:20.218 10:23:42 -- common/autotest_common.sh@854 -- # local nbd_name=nbd1 00:05:20.218 10:23:42 -- common/autotest_common.sh@855 -- # local i 00:05:20.218 10:23:42 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:05:20.218 10:23:42 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:05:20.218 10:23:42 -- common/autotest_common.sh@858 -- # grep -q -w nbd1 /proc/partitions 00:05:20.218 10:23:42 -- common/autotest_common.sh@859 -- # break 00:05:20.218 10:23:42 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:05:20.218 10:23:42 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:05:20.218 10:23:42 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:20.218 1+0 records in 00:05:20.218 1+0 records out 00:05:20.218 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000264838 s, 15.5 MB/s 00:05:20.218 10:23:42 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:20.218 10:23:42 -- common/autotest_common.sh@872 -- # size=4096 00:05:20.218 10:23:42 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:20.218 10:23:42 -- common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:05:20.218 10:23:42 -- common/autotest_common.sh@875 -- # return 0 00:05:20.218 10:23:42 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:20.218 10:23:42 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:20.218 10:23:42 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:20.218 10:23:42 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:20.218 10:23:42 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:20.477 { 00:05:20.477 "nbd_device": "/dev/nbd0", 00:05:20.477 "bdev_name": "Malloc0" 00:05:20.477 }, 00:05:20.477 { 00:05:20.477 "nbd_device": 
"/dev/nbd1", 00:05:20.477 "bdev_name": "Malloc1" 00:05:20.477 } 00:05:20.477 ]' 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:20.477 { 00:05:20.477 "nbd_device": "/dev/nbd0", 00:05:20.477 "bdev_name": "Malloc0" 00:05:20.477 }, 00:05:20.477 { 00:05:20.477 "nbd_device": "/dev/nbd1", 00:05:20.477 "bdev_name": "Malloc1" 00:05:20.477 } 00:05:20.477 ]' 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:20.477 /dev/nbd1' 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:20.477 /dev/nbd1' 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@65 -- # count=2 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@95 -- # count=2 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:20.477 256+0 records in 00:05:20.477 256+0 records out 00:05:20.477 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0102106 s, 103 MB/s 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:20.477 256+0 records in 00:05:20.477 256+0 records out 00:05:20.477 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0210855 s, 49.7 MB/s 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:20.477 256+0 records in 00:05:20.477 256+0 records out 00:05:20.477 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.022245 s, 47.1 MB/s 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:20.477 10:23:42 -- 
bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@51 -- # local i 00:05:20.477 10:23:42 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:20.478 10:23:42 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:20.736 10:23:42 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:20.736 10:23:42 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:20.736 10:23:42 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:20.736 10:23:42 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:20.736 10:23:42 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:20.736 10:23:42 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:20.736 10:23:42 -- bdev/nbd_common.sh@41 -- # break 00:05:20.736 10:23:42 -- bdev/nbd_common.sh@45 -- # return 0 00:05:20.736 10:23:42 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:20.736 10:23:42 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:20.995 10:23:42 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:20.995 10:23:42 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:20.995 10:23:42 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:20.995 10:23:42 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:20.995 10:23:42 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:20.995 10:23:42 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:20.995 10:23:42 -- bdev/nbd_common.sh@41 -- # break 00:05:20.995 10:23:42 -- bdev/nbd_common.sh@45 -- # return 0 00:05:20.995 10:23:42 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:20.995 10:23:42 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:20.995 10:23:42 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:21.256 10:23:43 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:21.256 10:23:43 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:21.256 10:23:43 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:21.256 10:23:43 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:21.256 10:23:43 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:21.256 10:23:43 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:21.256 10:23:43 -- bdev/nbd_common.sh@65 -- # true 00:05:21.256 10:23:43 -- bdev/nbd_common.sh@65 -- # count=0 00:05:21.256 10:23:43 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:21.256 10:23:43 -- bdev/nbd_common.sh@104 -- # count=0 00:05:21.256 10:23:43 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:21.256 10:23:43 -- bdev/nbd_common.sh@109 -- # return 0 00:05:21.257 10:23:43 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 
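That closes out app_repeat Round 0. The data path exercised in each round is the same: fill a scratch file from /dev/urandom, push it onto every exported nbd device with O_DIRECT writes, then cmp the first 1M of each device against the source before deleting it. A stripped-down sketch of that verify loop, with $SPDK_DIR again standing in for the full workspace path:

    tmp="$SPDK_DIR/test/event/nbdrandtest"
    dd if=/dev/urandom of="$tmp" bs=4096 count=256       # 1 MiB of random data
    for nbd in /dev/nbd0 /dev/nbd1; do
        dd if="$tmp" of="$nbd" bs=4096 count=256 oflag=direct
    done
    for nbd in /dev/nbd0 /dev/nbd1; do
        cmp -b -n 1M "$tmp" "$nbd"                       # any mismatch fails the round
    done
    rm "$tmp"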
00:05:21.515 10:23:43 -- event/event.sh@35 -- # sleep 3 00:05:21.515 [2024-04-19 10:23:43.560393] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:21.775 [2024-04-19 10:23:43.636554] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:21.775 [2024-04-19 10:23:43.636556] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:21.775 [2024-04-19 10:23:43.676398] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:21.775 [2024-04-19 10:23:43.676443] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:24.311 10:23:46 -- event/event.sh@23 -- # for i in {0..2} 00:05:24.311 10:23:46 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:05:24.311 spdk_app_start Round 1 00:05:24.311 10:23:46 -- event/event.sh@25 -- # waitforlisten 187949 /var/tmp/spdk-nbd.sock 00:05:24.311 10:23:46 -- common/autotest_common.sh@817 -- # '[' -z 187949 ']' 00:05:24.311 10:23:46 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:24.311 10:23:46 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:24.311 10:23:46 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:24.311 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:24.311 10:23:46 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:24.311 10:23:46 -- common/autotest_common.sh@10 -- # set +x 00:05:24.570 10:23:46 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:24.570 10:23:46 -- common/autotest_common.sh@850 -- # return 0 00:05:24.570 10:23:46 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:24.830 Malloc0 00:05:24.830 10:23:46 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:24.830 Malloc1 00:05:24.830 10:23:46 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:24.830 10:23:46 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:24.830 10:23:46 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:24.830 10:23:46 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:24.830 10:23:46 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:24.830 10:23:46 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:24.830 10:23:46 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:24.830 10:23:46 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:24.830 10:23:46 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:24.830 10:23:46 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:24.830 10:23:46 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:24.830 10:23:46 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:24.830 10:23:46 -- bdev/nbd_common.sh@12 -- # local i 00:05:24.830 10:23:46 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:24.830 10:23:46 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:24.830 10:23:46 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:25.089 
/dev/nbd0 00:05:25.089 10:23:47 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:25.090 10:23:47 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:25.090 10:23:47 -- common/autotest_common.sh@854 -- # local nbd_name=nbd0 00:05:25.090 10:23:47 -- common/autotest_common.sh@855 -- # local i 00:05:25.090 10:23:47 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:05:25.090 10:23:47 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:05:25.090 10:23:47 -- common/autotest_common.sh@858 -- # grep -q -w nbd0 /proc/partitions 00:05:25.090 10:23:47 -- common/autotest_common.sh@859 -- # break 00:05:25.090 10:23:47 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:05:25.090 10:23:47 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:05:25.090 10:23:47 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:25.090 1+0 records in 00:05:25.090 1+0 records out 00:05:25.090 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225786 s, 18.1 MB/s 00:05:25.090 10:23:47 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:25.090 10:23:47 -- common/autotest_common.sh@872 -- # size=4096 00:05:25.090 10:23:47 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:25.090 10:23:47 -- common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:05:25.090 10:23:47 -- common/autotest_common.sh@875 -- # return 0 00:05:25.090 10:23:47 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:25.090 10:23:47 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:25.090 10:23:47 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:25.349 /dev/nbd1 00:05:25.349 10:23:47 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:25.349 10:23:47 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:25.349 10:23:47 -- common/autotest_common.sh@854 -- # local nbd_name=nbd1 00:05:25.349 10:23:47 -- common/autotest_common.sh@855 -- # local i 00:05:25.349 10:23:47 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:05:25.349 10:23:47 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:05:25.349 10:23:47 -- common/autotest_common.sh@858 -- # grep -q -w nbd1 /proc/partitions 00:05:25.349 10:23:47 -- common/autotest_common.sh@859 -- # break 00:05:25.349 10:23:47 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:05:25.349 10:23:47 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:05:25.349 10:23:47 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:25.349 1+0 records in 00:05:25.349 1+0 records out 00:05:25.349 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250647 s, 16.3 MB/s 00:05:25.349 10:23:47 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:25.349 10:23:47 -- common/autotest_common.sh@872 -- # size=4096 00:05:25.349 10:23:47 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:25.349 10:23:47 -- common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:05:25.349 10:23:47 -- common/autotest_common.sh@875 -- # return 0 00:05:25.349 10:23:47 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:25.349 10:23:47 -- bdev/nbd_common.sh@14 -- # (( i < 2 
)) 00:05:25.349 10:23:47 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:25.349 10:23:47 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:25.349 10:23:47 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:25.608 10:23:47 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:25.608 { 00:05:25.608 "nbd_device": "/dev/nbd0", 00:05:25.608 "bdev_name": "Malloc0" 00:05:25.608 }, 00:05:25.608 { 00:05:25.608 "nbd_device": "/dev/nbd1", 00:05:25.608 "bdev_name": "Malloc1" 00:05:25.608 } 00:05:25.608 ]' 00:05:25.608 10:23:47 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:25.608 { 00:05:25.608 "nbd_device": "/dev/nbd0", 00:05:25.608 "bdev_name": "Malloc0" 00:05:25.608 }, 00:05:25.608 { 00:05:25.608 "nbd_device": "/dev/nbd1", 00:05:25.608 "bdev_name": "Malloc1" 00:05:25.608 } 00:05:25.608 ]' 00:05:25.608 10:23:47 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:25.608 10:23:47 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:25.608 /dev/nbd1' 00:05:25.608 10:23:47 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:25.608 /dev/nbd1' 00:05:25.608 10:23:47 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:25.608 10:23:47 -- bdev/nbd_common.sh@65 -- # count=2 00:05:25.608 10:23:47 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:25.608 10:23:47 -- bdev/nbd_common.sh@95 -- # count=2 00:05:25.608 10:23:47 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:25.608 10:23:47 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:25.608 10:23:47 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:25.608 10:23:47 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:25.608 10:23:47 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:25.608 10:23:47 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:25.609 256+0 records in 00:05:25.609 256+0 records out 00:05:25.609 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0111663 s, 93.9 MB/s 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:25.609 256+0 records in 00:05:25.609 256+0 records out 00:05:25.609 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.020524 s, 51.1 MB/s 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:25.609 256+0 records in 00:05:25.609 256+0 records out 00:05:25.609 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0218228 s, 48.0 MB/s 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@51 -- # local i 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:25.609 10:23:47 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:25.868 10:23:47 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:25.868 10:23:47 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:25.868 10:23:47 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:25.868 10:23:47 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:25.868 10:23:47 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:25.868 10:23:47 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:25.868 10:23:47 -- bdev/nbd_common.sh@41 -- # break 00:05:25.868 10:23:47 -- bdev/nbd_common.sh@45 -- # return 0 00:05:25.868 10:23:47 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:25.868 10:23:47 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:26.127 10:23:48 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:26.127 10:23:48 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:26.127 10:23:48 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:26.127 10:23:48 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:26.127 10:23:48 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:26.127 10:23:48 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:26.127 10:23:48 -- bdev/nbd_common.sh@41 -- # break 00:05:26.127 10:23:48 -- bdev/nbd_common.sh@45 -- # return 0 00:05:26.127 10:23:48 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:26.127 10:23:48 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:26.127 10:23:48 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:26.127 10:23:48 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:26.127 10:23:48 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:26.127 10:23:48 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:26.386 10:23:48 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:26.386 10:23:48 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:26.386 10:23:48 -- 
bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:26.386 10:23:48 -- bdev/nbd_common.sh@65 -- # true 00:05:26.386 10:23:48 -- bdev/nbd_common.sh@65 -- # count=0 00:05:26.386 10:23:48 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:26.386 10:23:48 -- bdev/nbd_common.sh@104 -- # count=0 00:05:26.386 10:23:48 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:26.386 10:23:48 -- bdev/nbd_common.sh@109 -- # return 0 00:05:26.386 10:23:48 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:26.386 10:23:48 -- event/event.sh@35 -- # sleep 3 00:05:26.645 [2024-04-19 10:23:48.664964] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:26.645 [2024-04-19 10:23:48.738081] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:26.645 [2024-04-19 10:23:48.738084] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:26.905 [2024-04-19 10:23:48.778897] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:26.905 [2024-04-19 10:23:48.778940] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:29.439 10:23:51 -- event/event.sh@23 -- # for i in {0..2} 00:05:29.439 10:23:51 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:05:29.439 spdk_app_start Round 2 00:05:29.439 10:23:51 -- event/event.sh@25 -- # waitforlisten 187949 /var/tmp/spdk-nbd.sock 00:05:29.439 10:23:51 -- common/autotest_common.sh@817 -- # '[' -z 187949 ']' 00:05:29.439 10:23:51 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:29.439 10:23:51 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:29.439 10:23:51 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:29.439 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
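Once Round 2's app is listening, the nbd devices are re-attached and re-probed. The waitfornbd helper in the trace does not take nbd_start_disk's return code on faith: it polls /proc/partitions until the device node appears, then reads one 4096-byte block back with O_DIRECT and checks the copy is non-empty. A sketch of that probe; the retry bound of 20 comes from the (( i <= 20 )) guards above, while the sleep between polls is an assumption (the trace does not show one):

    waitfornbd() {
        local nbd_name=$1 i size
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1                                    # assumed back-off, not visible in the trace
        done
        # round-trip one block to prove the kernel connection answers I/O
        dd if="/dev/$nbd_name" of="$SPDK_DIR/test/event/nbdtest" bs=4096 count=1 iflag=direct
        size=$(stat -c %s "$SPDK_DIR/test/event/nbdtest")
        rm -f "$SPDK_DIR/test/event/nbdtest"
        [ "$size" != 0 ]                                 # a zero-byte read means the device is dead
    }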
00:05:29.439 10:23:51 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:29.439 10:23:51 -- common/autotest_common.sh@10 -- # set +x 00:05:29.698 10:23:51 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:29.698 10:23:51 -- common/autotest_common.sh@850 -- # return 0 00:05:29.698 10:23:51 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:29.969 Malloc0 00:05:29.969 10:23:51 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:29.969 Malloc1 00:05:29.969 10:23:51 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:29.969 10:23:51 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:29.969 10:23:51 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:29.969 10:23:51 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:29.969 10:23:51 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:29.969 10:23:51 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:29.969 10:23:51 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:29.969 10:23:51 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:29.969 10:23:51 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:29.969 10:23:51 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:29.969 10:23:51 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:29.969 10:23:51 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:29.969 10:23:51 -- bdev/nbd_common.sh@12 -- # local i 00:05:29.969 10:23:51 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:29.969 10:23:51 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:29.969 10:23:51 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:30.228 /dev/nbd0 00:05:30.228 10:23:52 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:30.228 10:23:52 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:30.228 10:23:52 -- common/autotest_common.sh@854 -- # local nbd_name=nbd0 00:05:30.228 10:23:52 -- common/autotest_common.sh@855 -- # local i 00:05:30.228 10:23:52 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:05:30.228 10:23:52 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:05:30.228 10:23:52 -- common/autotest_common.sh@858 -- # grep -q -w nbd0 /proc/partitions 00:05:30.228 10:23:52 -- common/autotest_common.sh@859 -- # break 00:05:30.228 10:23:52 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:05:30.228 10:23:52 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:05:30.228 10:23:52 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:30.228 1+0 records in 00:05:30.228 1+0 records out 00:05:30.228 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247655 s, 16.5 MB/s 00:05:30.228 10:23:52 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:30.228 10:23:52 -- common/autotest_common.sh@872 -- # size=4096 00:05:30.228 10:23:52 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:30.228 10:23:52 -- 
common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:05:30.228 10:23:52 -- common/autotest_common.sh@875 -- # return 0 00:05:30.228 10:23:52 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:30.228 10:23:52 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:30.228 10:23:52 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:30.487 /dev/nbd1 00:05:30.487 10:23:52 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:30.487 10:23:52 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:30.487 10:23:52 -- common/autotest_common.sh@854 -- # local nbd_name=nbd1 00:05:30.487 10:23:52 -- common/autotest_common.sh@855 -- # local i 00:05:30.487 10:23:52 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:05:30.487 10:23:52 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:05:30.487 10:23:52 -- common/autotest_common.sh@858 -- # grep -q -w nbd1 /proc/partitions 00:05:30.487 10:23:52 -- common/autotest_common.sh@859 -- # break 00:05:30.487 10:23:52 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:05:30.487 10:23:52 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:05:30.487 10:23:52 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:30.487 1+0 records in 00:05:30.487 1+0 records out 00:05:30.487 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000248998 s, 16.4 MB/s 00:05:30.487 10:23:52 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:30.487 10:23:52 -- common/autotest_common.sh@872 -- # size=4096 00:05:30.487 10:23:52 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:30.487 10:23:52 -- common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:05:30.487 10:23:52 -- common/autotest_common.sh@875 -- # return 0 00:05:30.487 10:23:52 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:30.487 10:23:52 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:30.487 10:23:52 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:30.487 10:23:52 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:30.487 10:23:52 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:30.487 10:23:52 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:30.487 { 00:05:30.487 "nbd_device": "/dev/nbd0", 00:05:30.487 "bdev_name": "Malloc0" 00:05:30.487 }, 00:05:30.487 { 00:05:30.487 "nbd_device": "/dev/nbd1", 00:05:30.487 "bdev_name": "Malloc1" 00:05:30.487 } 00:05:30.487 ]' 00:05:30.487 10:23:52 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:30.487 { 00:05:30.487 "nbd_device": "/dev/nbd0", 00:05:30.487 "bdev_name": "Malloc0" 00:05:30.487 }, 00:05:30.487 { 00:05:30.487 "nbd_device": "/dev/nbd1", 00:05:30.487 "bdev_name": "Malloc1" 00:05:30.487 } 00:05:30.487 ]' 00:05:30.487 10:23:52 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:30.804 /dev/nbd1' 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:30.804 /dev/nbd1' 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@65 -- # count=2 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:30.804 10:23:52 -- 
bdev/nbd_common.sh@95 -- # count=2 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:30.804 256+0 records in 00:05:30.804 256+0 records out 00:05:30.804 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0112916 s, 92.9 MB/s 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:30.804 256+0 records in 00:05:30.804 256+0 records out 00:05:30.804 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.020697 s, 50.7 MB/s 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:30.804 256+0 records in 00:05:30.804 256+0 records out 00:05:30.804 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0218159 s, 48.1 MB/s 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@51 -- # local i 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:30.804 10:23:52 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:31.064 10:23:52 -- bdev/nbd_common.sh@41 -- # break 00:05:31.064 10:23:52 -- bdev/nbd_common.sh@45 -- # return 0 00:05:31.064 10:23:52 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:31.064 10:23:52 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:31.064 10:23:53 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:31.064 10:23:53 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:31.064 10:23:53 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:31.064 10:23:53 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:31.064 10:23:53 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:31.064 10:23:53 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:31.064 10:23:53 -- bdev/nbd_common.sh@41 -- # break 00:05:31.064 10:23:53 -- bdev/nbd_common.sh@45 -- # return 0 00:05:31.064 10:23:53 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:31.064 10:23:53 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:31.064 10:23:53 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:31.324 10:23:53 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:31.324 10:23:53 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:31.324 10:23:53 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:31.324 10:23:53 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:31.324 10:23:53 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:31.324 10:23:53 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:31.324 10:23:53 -- bdev/nbd_common.sh@65 -- # true 00:05:31.324 10:23:53 -- bdev/nbd_common.sh@65 -- # count=0 00:05:31.324 10:23:53 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:31.324 10:23:53 -- bdev/nbd_common.sh@104 -- # count=0 00:05:31.324 10:23:53 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:31.324 10:23:53 -- bdev/nbd_common.sh@109 -- # return 0 00:05:31.324 10:23:53 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:31.584 10:23:53 -- event/event.sh@35 -- # sleep 3 00:05:31.844 [2024-04-19 10:23:53.701916] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:31.844 [2024-04-19 10:23:53.775952] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:31.844 [2024-04-19 10:23:53.775954] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:31.844 [2024-04-19 10:23:53.815634] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:31.844 [2024-04-19 10:23:53.815683] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
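Before the final teardown the test double-checks that no nbd export survived the nbd_stop_disk calls. The nbd_get_count logic in the trace queries the target over RPC, extracts the device paths with jq, and counts them with grep -c; the trailing true soaks up grep's exit status 1 when nothing matches. A sketch, with rpc.py standing in for the full scripts path:

    count=$(rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks \
        | jq -r '.[] | .nbd_device' \
        | grep -c /dev/nbd || true)                      # grep -c exits 1 on zero matches
    if [ "$count" -ne 0 ]; then
        echo "leftover nbd devices: $count"              # any survivor would fail the run
        exit 1
    fi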
00:05:35.135 10:23:56 -- event/event.sh@38 -- # waitforlisten 187949 /var/tmp/spdk-nbd.sock 00:05:35.135 10:23:56 -- common/autotest_common.sh@817 -- # '[' -z 187949 ']' 00:05:35.135 10:23:56 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:35.135 10:23:56 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:35.135 10:23:56 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:35.135 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:35.135 10:23:56 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:35.135 10:23:56 -- common/autotest_common.sh@10 -- # set +x 00:05:35.135 10:23:56 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:35.135 10:23:56 -- common/autotest_common.sh@850 -- # return 0 00:05:35.135 10:23:56 -- event/event.sh@39 -- # killprocess 187949 00:05:35.135 10:23:56 -- common/autotest_common.sh@936 -- # '[' -z 187949 ']' 00:05:35.135 10:23:56 -- common/autotest_common.sh@940 -- # kill -0 187949 00:05:35.135 10:23:56 -- common/autotest_common.sh@941 -- # uname 00:05:35.135 10:23:56 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:35.135 10:23:56 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 187949 00:05:35.135 10:23:56 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:35.135 10:23:56 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:35.135 10:23:56 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 187949' 00:05:35.135 killing process with pid 187949 00:05:35.135 10:23:56 -- common/autotest_common.sh@955 -- # kill 187949 00:05:35.135 10:23:56 -- common/autotest_common.sh@960 -- # wait 187949 00:05:35.135 spdk_app_start is called in Round 0. 00:05:35.135 Shutdown signal received, stop current app iteration 00:05:35.135 Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 reinitialization... 00:05:35.135 spdk_app_start is called in Round 1. 00:05:35.135 Shutdown signal received, stop current app iteration 00:05:35.135 Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 reinitialization... 00:05:35.135 spdk_app_start is called in Round 2. 00:05:35.135 Shutdown signal received, stop current app iteration 00:05:35.135 Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 reinitialization... 00:05:35.135 spdk_app_start is called in Round 3. 
00:05:35.135 Shutdown signal received, stop current app iteration 00:05:35.135 10:23:56 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:05:35.135 10:23:56 -- event/event.sh@42 -- # return 0 00:05:35.135 00:05:35.135 real 0m16.288s 00:05:35.135 user 0m34.461s 00:05:35.135 sys 0m3.126s 00:05:35.135 10:23:56 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:35.135 10:23:56 -- common/autotest_common.sh@10 -- # set +x 00:05:35.135 ************************************ 00:05:35.135 END TEST app_repeat 00:05:35.135 ************************************ 00:05:35.135 10:23:56 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:05:35.135 10:23:56 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:05:35.135 10:23:56 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:35.135 10:23:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:35.135 10:23:56 -- common/autotest_common.sh@10 -- # set +x 00:05:35.135 ************************************ 00:05:35.135 START TEST cpu_locks 00:05:35.135 ************************************ 00:05:35.135 10:23:57 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:05:35.135 * Looking for test storage... 00:05:35.135 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:35.135 10:23:57 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:05:35.135 10:23:57 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:05:35.135 10:23:57 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:05:35.135 10:23:57 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:05:35.135 10:23:57 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:35.135 10:23:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:35.135 10:23:57 -- common/autotest_common.sh@10 -- # set +x 00:05:35.395 ************************************ 00:05:35.395 START TEST default_locks 00:05:35.395 ************************************ 00:05:35.395 10:23:57 -- common/autotest_common.sh@1111 -- # default_locks 00:05:35.395 10:23:57 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=190255 00:05:35.395 10:23:57 -- event/cpu_locks.sh@47 -- # waitforlisten 190255 00:05:35.395 10:23:57 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:35.395 10:23:57 -- common/autotest_common.sh@817 -- # '[' -z 190255 ']' 00:05:35.395 10:23:57 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:35.395 10:23:57 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:35.395 10:23:57 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:35.395 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:35.395 10:23:57 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:35.395 10:23:57 -- common/autotest_common.sh@10 -- # set +x 00:05:35.395 [2024-04-19 10:23:57.284876] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
00:05:35.395 [2024-04-19 10:23:57.284943] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid190255 ] 00:05:35.395 EAL: No free 2048 kB hugepages reported on node 1 00:05:35.395 [2024-04-19 10:23:57.355368] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:35.395 [2024-04-19 10:23:57.430740] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:36.332 10:23:58 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:36.332 10:23:58 -- common/autotest_common.sh@850 -- # return 0 00:05:36.332 10:23:58 -- event/cpu_locks.sh@49 -- # locks_exist 190255 00:05:36.332 10:23:58 -- event/cpu_locks.sh@22 -- # lslocks -p 190255 00:05:36.332 10:23:58 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:36.900 lslocks: write error 00:05:36.900 10:23:58 -- event/cpu_locks.sh@50 -- # killprocess 190255 00:05:36.900 10:23:58 -- common/autotest_common.sh@936 -- # '[' -z 190255 ']' 00:05:36.900 10:23:58 -- common/autotest_common.sh@940 -- # kill -0 190255 00:05:36.900 10:23:58 -- common/autotest_common.sh@941 -- # uname 00:05:36.900 10:23:58 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:36.900 10:23:58 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 190255 00:05:36.900 10:23:58 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:36.900 10:23:58 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:36.900 10:23:58 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 190255' 00:05:36.900 killing process with pid 190255 00:05:36.900 10:23:58 -- common/autotest_common.sh@955 -- # kill 190255 00:05:36.900 10:23:58 -- common/autotest_common.sh@960 -- # wait 190255 00:05:37.160 10:23:59 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 190255 00:05:37.160 10:23:59 -- common/autotest_common.sh@638 -- # local es=0 00:05:37.160 10:23:59 -- common/autotest_common.sh@640 -- # valid_exec_arg waitforlisten 190255 00:05:37.160 10:23:59 -- common/autotest_common.sh@626 -- # local arg=waitforlisten 00:05:37.160 10:23:59 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:37.160 10:23:59 -- common/autotest_common.sh@630 -- # type -t waitforlisten 00:05:37.160 10:23:59 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:37.160 10:23:59 -- common/autotest_common.sh@641 -- # waitforlisten 190255 00:05:37.160 10:23:59 -- common/autotest_common.sh@817 -- # '[' -z 190255 ']' 00:05:37.160 10:23:59 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:37.160 10:23:59 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:37.160 10:23:59 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:37.160 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:05:37.160 10:23:59 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:37.160 10:23:59 -- common/autotest_common.sh@10 -- # set +x 00:05:37.160 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 832: kill: (190255) - No such process 00:05:37.160 ERROR: process (pid: 190255) is no longer running 00:05:37.160 10:23:59 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:37.160 10:23:59 -- common/autotest_common.sh@850 -- # return 1 00:05:37.160 10:23:59 -- common/autotest_common.sh@641 -- # es=1 00:05:37.160 10:23:59 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:05:37.160 10:23:59 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:05:37.160 10:23:59 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:05:37.160 10:23:59 -- event/cpu_locks.sh@54 -- # no_locks 00:05:37.160 10:23:59 -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:37.160 10:23:59 -- event/cpu_locks.sh@26 -- # local lock_files 00:05:37.160 10:23:59 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:37.160 00:05:37.160 real 0m1.829s 00:05:37.160 user 0m1.927s 00:05:37.160 sys 0m0.702s 00:05:37.160 10:23:59 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:37.160 10:23:59 -- common/autotest_common.sh@10 -- # set +x 00:05:37.160 ************************************ 00:05:37.160 END TEST default_locks 00:05:37.160 ************************************ 00:05:37.160 10:23:59 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:05:37.160 10:23:59 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:37.160 10:23:59 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:37.160 10:23:59 -- common/autotest_common.sh@10 -- # set +x 00:05:37.160 ************************************ 00:05:37.160 START TEST default_locks_via_rpc 00:05:37.160 ************************************ 00:05:37.160 10:23:59 -- common/autotest_common.sh@1111 -- # default_locks_via_rpc 00:05:37.160 10:23:59 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=190629 00:05:37.160 10:23:59 -- event/cpu_locks.sh@63 -- # waitforlisten 190629 00:05:37.160 10:23:59 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:37.160 10:23:59 -- common/autotest_common.sh@817 -- # '[' -z 190629 ']' 00:05:37.160 10:23:59 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:37.160 10:23:59 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:37.160 10:23:59 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:37.160 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:37.160 10:23:59 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:37.160 10:23:59 -- common/autotest_common.sh@10 -- # set +x 00:05:37.420 [2024-04-19 10:23:59.271422] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
00:05:37.420 [2024-04-19 10:23:59.271485] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid190629 ] 00:05:37.420 EAL: No free 2048 kB hugepages reported on node 1 00:05:37.420 [2024-04-19 10:23:59.342109] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:37.420 [2024-04-19 10:23:59.418710] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:38.358 10:24:00 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:38.358 10:24:00 -- common/autotest_common.sh@850 -- # return 0 00:05:38.358 10:24:00 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:05:38.358 10:24:00 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:38.358 10:24:00 -- common/autotest_common.sh@10 -- # set +x 00:05:38.358 10:24:00 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:38.358 10:24:00 -- event/cpu_locks.sh@67 -- # no_locks 00:05:38.358 10:24:00 -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:38.358 10:24:00 -- event/cpu_locks.sh@26 -- # local lock_files 00:05:38.358 10:24:00 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:38.358 10:24:00 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:05:38.358 10:24:00 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:38.358 10:24:00 -- common/autotest_common.sh@10 -- # set +x 00:05:38.358 10:24:00 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:38.358 10:24:00 -- event/cpu_locks.sh@71 -- # locks_exist 190629 00:05:38.358 10:24:00 -- event/cpu_locks.sh@22 -- # lslocks -p 190629 00:05:38.358 10:24:00 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:38.926 10:24:00 -- event/cpu_locks.sh@73 -- # killprocess 190629 00:05:38.926 10:24:00 -- common/autotest_common.sh@936 -- # '[' -z 190629 ']' 00:05:38.926 10:24:00 -- common/autotest_common.sh@940 -- # kill -0 190629 00:05:38.926 10:24:00 -- common/autotest_common.sh@941 -- # uname 00:05:38.926 10:24:00 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:38.926 10:24:00 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 190629 00:05:38.926 10:24:00 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:38.926 10:24:00 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:38.926 10:24:00 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 190629' 00:05:38.926 killing process with pid 190629 00:05:38.926 10:24:00 -- common/autotest_common.sh@955 -- # kill 190629 00:05:38.926 10:24:00 -- common/autotest_common.sh@960 -- # wait 190629 00:05:39.185 00:05:39.185 real 0m1.846s 00:05:39.185 user 0m1.943s 00:05:39.185 sys 0m0.662s 00:05:39.185 10:24:01 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:39.185 10:24:01 -- common/autotest_common.sh@10 -- # set +x 00:05:39.185 ************************************ 00:05:39.185 END TEST default_locks_via_rpc 00:05:39.185 ************************************ 00:05:39.185 10:24:01 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:05:39.185 10:24:01 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:39.185 10:24:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:39.185 10:24:01 -- common/autotest_common.sh@10 -- # set +x 00:05:39.185 ************************************ 00:05:39.185 START TEST non_locking_app_on_locked_coremask 00:05:39.185 
************************************ 00:05:39.185 10:24:01 -- common/autotest_common.sh@1111 -- # non_locking_app_on_locked_coremask 00:05:39.185 10:24:01 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:39.185 10:24:01 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=190933 00:05:39.185 10:24:01 -- event/cpu_locks.sh@81 -- # waitforlisten 190933 /var/tmp/spdk.sock 00:05:39.185 10:24:01 -- common/autotest_common.sh@817 -- # '[' -z 190933 ']' 00:05:39.185 10:24:01 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:39.185 10:24:01 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:39.185 10:24:01 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:39.185 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:39.185 10:24:01 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:39.185 10:24:01 -- common/autotest_common.sh@10 -- # set +x 00:05:39.185 [2024-04-19 10:24:01.246002] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:05:39.185 [2024-04-19 10:24:01.246055] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid190933 ] 00:05:39.185 EAL: No free 2048 kB hugepages reported on node 1 00:05:39.444 [2024-04-19 10:24:01.313264] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:39.444 [2024-04-19 10:24:01.398130] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:40.012 10:24:02 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:40.013 10:24:02 -- common/autotest_common.sh@850 -- # return 0 00:05:40.013 10:24:02 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=191138 00:05:40.013 10:24:02 -- event/cpu_locks.sh@85 -- # waitforlisten 191138 /var/tmp/spdk2.sock 00:05:40.013 10:24:02 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:05:40.013 10:24:02 -- common/autotest_common.sh@817 -- # '[' -z 191138 ']' 00:05:40.013 10:24:02 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:40.013 10:24:02 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:40.013 10:24:02 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:40.013 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:40.013 10:24:02 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:40.013 10:24:02 -- common/autotest_common.sh@10 -- # set +x 00:05:40.013 [2024-04-19 10:24:02.097814] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:05:40.013 [2024-04-19 10:24:02.097907] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid191138 ] 00:05:40.272 EAL: No free 2048 kB hugepages reported on node 1 00:05:40.272 [2024-04-19 10:24:02.190496] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:05:40.272 [2024-04-19 10:24:02.190525] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:40.272 [2024-04-19 10:24:02.355451] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:40.840 10:24:02 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:40.840 10:24:02 -- common/autotest_common.sh@850 -- # return 0 00:05:40.840 10:24:02 -- event/cpu_locks.sh@87 -- # locks_exist 190933 00:05:40.840 10:24:02 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:40.840 10:24:02 -- event/cpu_locks.sh@22 -- # lslocks -p 190933 00:05:42.222 lslocks: write error 00:05:42.222 10:24:04 -- event/cpu_locks.sh@89 -- # killprocess 190933 00:05:42.222 10:24:04 -- common/autotest_common.sh@936 -- # '[' -z 190933 ']' 00:05:42.222 10:24:04 -- common/autotest_common.sh@940 -- # kill -0 190933 00:05:42.222 10:24:04 -- common/autotest_common.sh@941 -- # uname 00:05:42.222 10:24:04 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:42.222 10:24:04 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 190933 00:05:42.222 10:24:04 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:42.222 10:24:04 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:42.222 10:24:04 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 190933' 00:05:42.222 killing process with pid 190933 00:05:42.222 10:24:04 -- common/autotest_common.sh@955 -- # kill 190933 00:05:42.222 10:24:04 -- common/autotest_common.sh@960 -- # wait 190933 00:05:42.789 10:24:04 -- event/cpu_locks.sh@90 -- # killprocess 191138 00:05:42.789 10:24:04 -- common/autotest_common.sh@936 -- # '[' -z 191138 ']' 00:05:42.789 10:24:04 -- common/autotest_common.sh@940 -- # kill -0 191138 00:05:42.789 10:24:04 -- common/autotest_common.sh@941 -- # uname 00:05:42.789 10:24:04 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:42.789 10:24:04 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 191138 00:05:42.789 10:24:04 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:42.789 10:24:04 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:42.789 10:24:04 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 191138' 00:05:42.789 killing process with pid 191138 00:05:42.789 10:24:04 -- common/autotest_common.sh@955 -- # kill 191138 00:05:42.789 10:24:04 -- common/autotest_common.sh@960 -- # wait 191138 00:05:43.047 00:05:43.047 real 0m3.823s 00:05:43.047 user 0m4.078s 00:05:43.047 sys 0m1.249s 00:05:43.047 10:24:05 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:43.047 10:24:05 -- common/autotest_common.sh@10 -- # set +x 00:05:43.047 ************************************ 00:05:43.047 END TEST non_locking_app_on_locked_coremask 00:05:43.047 ************************************ 00:05:43.047 10:24:05 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:05:43.047 10:24:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:43.047 10:24:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:43.047 10:24:05 -- common/autotest_common.sh@10 -- # set +x 00:05:43.307 ************************************ 00:05:43.307 START TEST locking_app_on_unlocked_coremask 00:05:43.307 ************************************ 00:05:43.307 10:24:05 -- common/autotest_common.sh@1111 -- # locking_app_on_unlocked_coremask 00:05:43.307 10:24:05 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=191712 00:05:43.307 10:24:05 -- event/cpu_locks.sh@99 -- # 
waitforlisten 191712 /var/tmp/spdk.sock 00:05:43.307 10:24:05 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:05:43.307 10:24:05 -- common/autotest_common.sh@817 -- # '[' -z 191712 ']' 00:05:43.307 10:24:05 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:43.307 10:24:05 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:43.307 10:24:05 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:43.307 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:43.307 10:24:05 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:43.307 10:24:05 -- common/autotest_common.sh@10 -- # set +x 00:05:43.307 [2024-04-19 10:24:05.210785] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:05:43.307 [2024-04-19 10:24:05.210867] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid191712 ] 00:05:43.307 EAL: No free 2048 kB hugepages reported on node 1 00:05:43.307 [2024-04-19 10:24:05.281013] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:43.307 [2024-04-19 10:24:05.281042] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:43.307 [2024-04-19 10:24:05.367119] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:44.245 10:24:06 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:44.245 10:24:06 -- common/autotest_common.sh@850 -- # return 0 00:05:44.245 10:24:06 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:44.245 10:24:06 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=192011 00:05:44.245 10:24:06 -- event/cpu_locks.sh@103 -- # waitforlisten 192011 /var/tmp/spdk2.sock 00:05:44.245 10:24:06 -- common/autotest_common.sh@817 -- # '[' -z 192011 ']' 00:05:44.245 10:24:06 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:44.245 10:24:06 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:44.245 10:24:06 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:44.245 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:44.245 10:24:06 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:44.245 10:24:06 -- common/autotest_common.sh@10 -- # set +x 00:05:44.245 [2024-04-19 10:24:06.031846] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
00:05:44.245 [2024-04-19 10:24:06.031912] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid192011 ] 00:05:44.245 EAL: No free 2048 kB hugepages reported on node 1 00:05:44.245 [2024-04-19 10:24:06.119607] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:44.245 [2024-04-19 10:24:06.285010] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:44.813 10:24:06 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:44.813 10:24:06 -- common/autotest_common.sh@850 -- # return 0 00:05:44.813 10:24:06 -- event/cpu_locks.sh@105 -- # locks_exist 192011 00:05:44.813 10:24:06 -- event/cpu_locks.sh@22 -- # lslocks -p 192011 00:05:44.813 10:24:06 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:46.191 lslocks: write error 00:05:46.191 10:24:08 -- event/cpu_locks.sh@107 -- # killprocess 191712 00:05:46.191 10:24:08 -- common/autotest_common.sh@936 -- # '[' -z 191712 ']' 00:05:46.191 10:24:08 -- common/autotest_common.sh@940 -- # kill -0 191712 00:05:46.191 10:24:08 -- common/autotest_common.sh@941 -- # uname 00:05:46.191 10:24:08 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:46.191 10:24:08 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 191712 00:05:46.191 10:24:08 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:46.191 10:24:08 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:46.191 10:24:08 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 191712' 00:05:46.191 killing process with pid 191712 00:05:46.191 10:24:08 -- common/autotest_common.sh@955 -- # kill 191712 00:05:46.191 10:24:08 -- common/autotest_common.sh@960 -- # wait 191712 00:05:46.761 10:24:08 -- event/cpu_locks.sh@108 -- # killprocess 192011 00:05:46.761 10:24:08 -- common/autotest_common.sh@936 -- # '[' -z 192011 ']' 00:05:46.761 10:24:08 -- common/autotest_common.sh@940 -- # kill -0 192011 00:05:46.761 10:24:08 -- common/autotest_common.sh@941 -- # uname 00:05:46.761 10:24:08 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:46.761 10:24:08 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 192011 00:05:46.761 10:24:08 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:46.761 10:24:08 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:46.761 10:24:08 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 192011' 00:05:46.761 killing process with pid 192011 00:05:46.761 10:24:08 -- common/autotest_common.sh@955 -- # kill 192011 00:05:46.761 10:24:08 -- common/autotest_common.sh@960 -- # wait 192011 00:05:47.021 00:05:47.021 real 0m3.837s 00:05:47.021 user 0m4.084s 00:05:47.021 sys 0m1.288s 00:05:47.021 10:24:09 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:47.021 10:24:09 -- common/autotest_common.sh@10 -- # set +x 00:05:47.021 ************************************ 00:05:47.021 END TEST locking_app_on_unlocked_coremask 00:05:47.021 ************************************ 00:05:47.021 10:24:09 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:05:47.021 10:24:09 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:47.021 10:24:09 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:47.021 10:24:09 -- common/autotest_common.sh@10 -- # set +x 00:05:47.281 
************************************ 00:05:47.281 START TEST locking_app_on_locked_coremask 00:05:47.281 ************************************ 00:05:47.281 10:24:09 -- common/autotest_common.sh@1111 -- # locking_app_on_locked_coremask 00:05:47.281 10:24:09 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=192576 00:05:47.281 10:24:09 -- event/cpu_locks.sh@116 -- # waitforlisten 192576 /var/tmp/spdk.sock 00:05:47.281 10:24:09 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:47.281 10:24:09 -- common/autotest_common.sh@817 -- # '[' -z 192576 ']' 00:05:47.281 10:24:09 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:47.281 10:24:09 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:47.281 10:24:09 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:47.281 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:47.281 10:24:09 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:47.281 10:24:09 -- common/autotest_common.sh@10 -- # set +x 00:05:47.281 [2024-04-19 10:24:09.212448] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:05:47.281 [2024-04-19 10:24:09.212524] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid192576 ] 00:05:47.281 EAL: No free 2048 kB hugepages reported on node 1 00:05:47.281 [2024-04-19 10:24:09.283232] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:47.281 [2024-04-19 10:24:09.361225] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:48.218 10:24:10 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:48.218 10:24:10 -- common/autotest_common.sh@850 -- # return 0 00:05:48.218 10:24:10 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=192743 00:05:48.218 10:24:10 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:48.218 10:24:10 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 192743 /var/tmp/spdk2.sock 00:05:48.218 10:24:10 -- common/autotest_common.sh@638 -- # local es=0 00:05:48.218 10:24:10 -- common/autotest_common.sh@640 -- # valid_exec_arg waitforlisten 192743 /var/tmp/spdk2.sock 00:05:48.218 10:24:10 -- common/autotest_common.sh@626 -- # local arg=waitforlisten 00:05:48.218 10:24:10 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:48.218 10:24:10 -- common/autotest_common.sh@630 -- # type -t waitforlisten 00:05:48.218 10:24:10 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:48.218 10:24:10 -- common/autotest_common.sh@641 -- # waitforlisten 192743 /var/tmp/spdk2.sock 00:05:48.218 10:24:10 -- common/autotest_common.sh@817 -- # '[' -z 192743 ']' 00:05:48.218 10:24:10 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:48.218 10:24:10 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:48.218 10:24:10 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:48.218 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
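[editor's note] The locking cases in this run all hinge on one mechanism: each spdk_tgt claims its cores by holding a lock file per core (the /var/tmp/spdk_cpu_lock_000-style names that show up in check_remaining_locks later in the trace), which is why locks_exist greps lslocks output for spdk_cpu_lock. In the case that follows, the second instance (pid 192743) is launched under NOT, i.e. it is expected to fail to claim core 0 held by pid 192576. A hedged way to poke at the same state from a shell, with the pid and paths taken from this run:

    pid=192576
    # Mirrors the locks_exist helper: list the per-core lock files this pid holds
    lslocks -p "$pid" | grep spdk_cpu_lock
    # A second instance on an overlapping mask should abort, as the trace shows:
    #   "Cannot create lock on core 0, probably process 192576 has claimed it."
    ./build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock   # expected to exit non-zero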
00:05:48.218 10:24:10 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:48.218 10:24:10 -- common/autotest_common.sh@10 -- # set +x 00:05:48.218 [2024-04-19 10:24:10.050156] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:05:48.218 [2024-04-19 10:24:10.050237] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid192743 ] 00:05:48.218 EAL: No free 2048 kB hugepages reported on node 1 00:05:48.218 [2024-04-19 10:24:10.146828] app.c: 691:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 192576 has claimed it. 00:05:48.218 [2024-04-19 10:24:10.146875] app.c: 821:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:05:48.786 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 832: kill: (192743) - No such process 00:05:48.786 ERROR: process (pid: 192743) is no longer running 00:05:48.786 10:24:10 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:48.786 10:24:10 -- common/autotest_common.sh@850 -- # return 1 00:05:48.786 10:24:10 -- common/autotest_common.sh@641 -- # es=1 00:05:48.786 10:24:10 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:05:48.786 10:24:10 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:05:48.786 10:24:10 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:05:48.786 10:24:10 -- event/cpu_locks.sh@122 -- # locks_exist 192576 00:05:48.786 10:24:10 -- event/cpu_locks.sh@22 -- # lslocks -p 192576 00:05:48.786 10:24:10 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:49.045 lslocks: write error 00:05:49.045 10:24:11 -- event/cpu_locks.sh@124 -- # killprocess 192576 00:05:49.045 10:24:11 -- common/autotest_common.sh@936 -- # '[' -z 192576 ']' 00:05:49.045 10:24:11 -- common/autotest_common.sh@940 -- # kill -0 192576 00:05:49.045 10:24:11 -- common/autotest_common.sh@941 -- # uname 00:05:49.045 10:24:11 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:49.045 10:24:11 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 192576 00:05:49.045 10:24:11 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:49.045 10:24:11 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:49.045 10:24:11 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 192576' 00:05:49.045 killing process with pid 192576 00:05:49.045 10:24:11 -- common/autotest_common.sh@955 -- # kill 192576 00:05:49.045 10:24:11 -- common/autotest_common.sh@960 -- # wait 192576 00:05:49.617 00:05:49.617 real 0m2.237s 00:05:49.617 user 0m2.435s 00:05:49.617 sys 0m0.670s 00:05:49.617 10:24:11 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:49.617 10:24:11 -- common/autotest_common.sh@10 -- # set +x 00:05:49.617 ************************************ 00:05:49.617 END TEST locking_app_on_locked_coremask 00:05:49.617 ************************************ 00:05:49.617 10:24:11 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:05:49.617 10:24:11 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:49.617 10:24:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:49.617 10:24:11 -- common/autotest_common.sh@10 -- # set +x 00:05:49.617 ************************************ 00:05:49.617 START TEST locking_overlapped_coremask 00:05:49.617 
************************************ 00:05:49.617 10:24:11 -- common/autotest_common.sh@1111 -- # locking_overlapped_coremask 00:05:49.617 10:24:11 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=192955 00:05:49.617 10:24:11 -- event/cpu_locks.sh@133 -- # waitforlisten 192955 /var/tmp/spdk.sock 00:05:49.617 10:24:11 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:05:49.618 10:24:11 -- common/autotest_common.sh@817 -- # '[' -z 192955 ']' 00:05:49.618 10:24:11 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:49.618 10:24:11 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:49.618 10:24:11 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:49.618 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:49.618 10:24:11 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:49.618 10:24:11 -- common/autotest_common.sh@10 -- # set +x 00:05:49.618 [2024-04-19 10:24:11.614050] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:05:49.618 [2024-04-19 10:24:11.614125] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid192955 ] 00:05:49.618 EAL: No free 2048 kB hugepages reported on node 1 00:05:49.618 [2024-04-19 10:24:11.682709] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:49.877 [2024-04-19 10:24:11.770453] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:49.877 [2024-04-19 10:24:11.770539] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:49.877 [2024-04-19 10:24:11.770541] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:50.445 10:24:12 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:50.445 10:24:12 -- common/autotest_common.sh@850 -- # return 0 00:05:50.445 10:24:12 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=193116 00:05:50.445 10:24:12 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 193116 /var/tmp/spdk2.sock 00:05:50.445 10:24:12 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:05:50.445 10:24:12 -- common/autotest_common.sh@638 -- # local es=0 00:05:50.445 10:24:12 -- common/autotest_common.sh@640 -- # valid_exec_arg waitforlisten 193116 /var/tmp/spdk2.sock 00:05:50.445 10:24:12 -- common/autotest_common.sh@626 -- # local arg=waitforlisten 00:05:50.445 10:24:12 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:50.445 10:24:12 -- common/autotest_common.sh@630 -- # type -t waitforlisten 00:05:50.445 10:24:12 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:50.445 10:24:12 -- common/autotest_common.sh@641 -- # waitforlisten 193116 /var/tmp/spdk2.sock 00:05:50.446 10:24:12 -- common/autotest_common.sh@817 -- # '[' -z 193116 ']' 00:05:50.446 10:24:12 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:50.446 10:24:12 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:50.446 10:24:12 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 
00:05:50.446 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:50.446 10:24:12 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:50.446 10:24:12 -- common/autotest_common.sh@10 -- # set +x 00:05:50.446 [2024-04-19 10:24:12.468247] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:05:50.446 [2024-04-19 10:24:12.468316] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193116 ] 00:05:50.446 EAL: No free 2048 kB hugepages reported on node 1 00:05:50.705 [2024-04-19 10:24:12.564695] app.c: 691:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 192955 has claimed it. 00:05:50.705 [2024-04-19 10:24:12.564735] app.c: 821:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:05:51.281 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 832: kill: (193116) - No such process 00:05:51.281 ERROR: process (pid: 193116) is no longer running 00:05:51.281 10:24:13 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:51.281 10:24:13 -- common/autotest_common.sh@850 -- # return 1 00:05:51.281 10:24:13 -- common/autotest_common.sh@641 -- # es=1 00:05:51.281 10:24:13 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:05:51.281 10:24:13 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:05:51.281 10:24:13 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:05:51.281 10:24:13 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:05:51.281 10:24:13 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:05:51.281 10:24:13 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:05:51.281 10:24:13 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:05:51.281 10:24:13 -- event/cpu_locks.sh@141 -- # killprocess 192955 00:05:51.281 10:24:13 -- common/autotest_common.sh@936 -- # '[' -z 192955 ']' 00:05:51.281 10:24:13 -- common/autotest_common.sh@940 -- # kill -0 192955 00:05:51.282 10:24:13 -- common/autotest_common.sh@941 -- # uname 00:05:51.282 10:24:13 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:51.282 10:24:13 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 192955 00:05:51.282 10:24:13 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:51.282 10:24:13 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:51.282 10:24:13 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 192955' 00:05:51.282 killing process with pid 192955 00:05:51.282 10:24:13 -- common/autotest_common.sh@955 -- # kill 192955 00:05:51.282 10:24:13 -- common/autotest_common.sh@960 -- # wait 192955 00:05:51.541 00:05:51.541 real 0m1.887s 00:05:51.541 user 0m5.296s 00:05:51.541 sys 0m0.451s 00:05:51.541 10:24:13 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:51.541 10:24:13 -- common/autotest_common.sh@10 -- # set +x 00:05:51.541 ************************************ 00:05:51.541 END TEST locking_overlapped_coremask 00:05:51.541 ************************************ 00:05:51.541 10:24:13 -- event/cpu_locks.sh@172 -- # run_test 
locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:05:51.541 10:24:13 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:51.541 10:24:13 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:51.541 10:24:13 -- common/autotest_common.sh@10 -- # set +x 00:05:51.541 ************************************ 00:05:51.541 START TEST locking_overlapped_coremask_via_rpc 00:05:51.541 ************************************ 00:05:51.541 10:24:13 -- common/autotest_common.sh@1111 -- # locking_overlapped_coremask_via_rpc 00:05:51.541 10:24:13 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=193259 00:05:51.541 10:24:13 -- event/cpu_locks.sh@149 -- # waitforlisten 193259 /var/tmp/spdk.sock 00:05:51.541 10:24:13 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:05:51.541 10:24:13 -- common/autotest_common.sh@817 -- # '[' -z 193259 ']' 00:05:51.541 10:24:13 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:51.541 10:24:13 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:51.541 10:24:13 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:51.541 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:51.541 10:24:13 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:51.541 10:24:13 -- common/autotest_common.sh@10 -- # set +x 00:05:51.802 [2024-04-19 10:24:13.662216] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:05:51.802 [2024-04-19 10:24:13.662302] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193259 ] 00:05:51.802 EAL: No free 2048 kB hugepages reported on node 1 00:05:51.802 [2024-04-19 10:24:13.731258] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:51.802 [2024-04-19 10:24:13.731285] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:51.802 [2024-04-19 10:24:13.817500] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:51.802 [2024-04-19 10:24:13.817584] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:51.802 [2024-04-19 10:24:13.817586] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:52.372 10:24:14 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:52.372 10:24:14 -- common/autotest_common.sh@850 -- # return 0 00:05:52.372 10:24:14 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=193355 00:05:52.372 10:24:14 -- event/cpu_locks.sh@153 -- # waitforlisten 193355 /var/tmp/spdk2.sock 00:05:52.372 10:24:14 -- common/autotest_common.sh@817 -- # '[' -z 193355 ']' 00:05:52.372 10:24:14 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:52.372 10:24:14 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:52.372 10:24:14 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:52.372 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:05:52.372 10:24:14 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:52.372 10:24:14 -- common/autotest_common.sh@10 -- # set +x 00:05:52.372 10:24:14 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:05:52.633 [2024-04-19 10:24:14.505125] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:05:52.633 [2024-04-19 10:24:14.505214] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193355 ] 00:05:52.633 EAL: No free 2048 kB hugepages reported on node 1 00:05:52.633 [2024-04-19 10:24:14.600745] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:52.633 [2024-04-19 10:24:14.600772] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:52.893 [2024-04-19 10:24:14.760225] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:52.893 [2024-04-19 10:24:14.760334] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:52.893 [2024-04-19 10:24:14.760336] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:05:53.461 10:24:15 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:53.461 10:24:15 -- common/autotest_common.sh@850 -- # return 0 00:05:53.462 10:24:15 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:05:53.462 10:24:15 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:53.462 10:24:15 -- common/autotest_common.sh@10 -- # set +x 00:05:53.462 10:24:15 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:53.462 10:24:15 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:53.462 10:24:15 -- common/autotest_common.sh@638 -- # local es=0 00:05:53.462 10:24:15 -- common/autotest_common.sh@640 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:53.462 10:24:15 -- common/autotest_common.sh@626 -- # local arg=rpc_cmd 00:05:53.462 10:24:15 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:53.462 10:24:15 -- common/autotest_common.sh@630 -- # type -t rpc_cmd 00:05:53.462 10:24:15 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:53.462 10:24:15 -- common/autotest_common.sh@641 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:53.462 10:24:15 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:53.462 10:24:15 -- common/autotest_common.sh@10 -- # set +x 00:05:53.462 [2024-04-19 10:24:15.333876] app.c: 691:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 193259 has claimed it. 
00:05:53.462 request: 00:05:53.462 { 00:05:53.462 "method": "framework_enable_cpumask_locks", 00:05:53.462 "req_id": 1 00:05:53.462 } 00:05:53.462 Got JSON-RPC error response 00:05:53.462 response: 00:05:53.462 { 00:05:53.462 "code": -32603, 00:05:53.462 "message": "Failed to claim CPU core: 2" 00:05:53.462 } 00:05:53.462 10:24:15 -- common/autotest_common.sh@577 -- # [[ 1 == 0 ]] 00:05:53.462 10:24:15 -- common/autotest_common.sh@641 -- # es=1 00:05:53.462 10:24:15 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:05:53.462 10:24:15 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:05:53.462 10:24:15 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:05:53.462 10:24:15 -- event/cpu_locks.sh@158 -- # waitforlisten 193259 /var/tmp/spdk.sock 00:05:53.462 10:24:15 -- common/autotest_common.sh@817 -- # '[' -z 193259 ']' 00:05:53.462 10:24:15 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:53.462 10:24:15 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:53.462 10:24:15 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:53.462 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:53.462 10:24:15 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:53.462 10:24:15 -- common/autotest_common.sh@10 -- # set +x 00:05:53.462 10:24:15 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:53.462 10:24:15 -- common/autotest_common.sh@850 -- # return 0 00:05:53.462 10:24:15 -- event/cpu_locks.sh@159 -- # waitforlisten 193355 /var/tmp/spdk2.sock 00:05:53.462 10:24:15 -- common/autotest_common.sh@817 -- # '[' -z 193355 ']' 00:05:53.462 10:24:15 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:53.462 10:24:15 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:53.462 10:24:15 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:53.462 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
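[editor's note] The JSON-RPC exchange just above is the point of locking_overlapped_coremask_via_rpc: both targets start with --disable-cpumask-locks, the first (mask 0x7) then takes its per-core locks via framework_enable_cpumask_locks, and the same call against the second target (mask 0x1c, which shares core 2) fails with error -32603, "Failed to claim CPU core: 2". Roughly the same exchange from the command line, with the socket paths as used in this run:

    # First target claims cores 0-2...
    ./scripts/rpc.py -s /var/tmp/spdk.sock framework_enable_cpumask_locks
    # ...so the overlapping target cannot claim core 2 and the RPC errors out
    ./scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks \
        || echo "expected: Failed to claim CPU core: 2"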
00:05:53.462 10:24:15 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:53.462 10:24:15 -- common/autotest_common.sh@10 -- # set +x 00:05:53.721 10:24:15 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:53.721 10:24:15 -- common/autotest_common.sh@850 -- # return 0 00:05:53.721 10:24:15 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:05:53.721 10:24:15 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:05:53.721 10:24:15 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:05:53.721 10:24:15 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:05:53.721 00:05:53.721 real 0m2.067s 00:05:53.721 user 0m0.795s 00:05:53.721 sys 0m0.210s 00:05:53.721 10:24:15 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:53.721 10:24:15 -- common/autotest_common.sh@10 -- # set +x 00:05:53.721 ************************************ 00:05:53.721 END TEST locking_overlapped_coremask_via_rpc 00:05:53.721 ************************************ 00:05:53.721 10:24:15 -- event/cpu_locks.sh@174 -- # cleanup 00:05:53.721 10:24:15 -- event/cpu_locks.sh@15 -- # [[ -z 193259 ]] 00:05:53.721 10:24:15 -- event/cpu_locks.sh@15 -- # killprocess 193259 00:05:53.721 10:24:15 -- common/autotest_common.sh@936 -- # '[' -z 193259 ']' 00:05:53.721 10:24:15 -- common/autotest_common.sh@940 -- # kill -0 193259 00:05:53.721 10:24:15 -- common/autotest_common.sh@941 -- # uname 00:05:53.721 10:24:15 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:53.721 10:24:15 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 193259 00:05:53.721 10:24:15 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:53.721 10:24:15 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:53.721 10:24:15 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 193259' 00:05:53.721 killing process with pid 193259 00:05:53.721 10:24:15 -- common/autotest_common.sh@955 -- # kill 193259 00:05:53.721 10:24:15 -- common/autotest_common.sh@960 -- # wait 193259 00:05:54.291 10:24:16 -- event/cpu_locks.sh@16 -- # [[ -z 193355 ]] 00:05:54.291 10:24:16 -- event/cpu_locks.sh@16 -- # killprocess 193355 00:05:54.291 10:24:16 -- common/autotest_common.sh@936 -- # '[' -z 193355 ']' 00:05:54.291 10:24:16 -- common/autotest_common.sh@940 -- # kill -0 193355 00:05:54.291 10:24:16 -- common/autotest_common.sh@941 -- # uname 00:05:54.291 10:24:16 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:54.291 10:24:16 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 193355 00:05:54.291 10:24:16 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:05:54.291 10:24:16 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:05:54.291 10:24:16 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 193355' 00:05:54.291 killing process with pid 193355 00:05:54.291 10:24:16 -- common/autotest_common.sh@955 -- # kill 193355 00:05:54.291 10:24:16 -- common/autotest_common.sh@960 -- # wait 193355 00:05:54.550 10:24:16 -- event/cpu_locks.sh@18 -- # rm -f 00:05:54.550 10:24:16 -- event/cpu_locks.sh@1 -- # cleanup 00:05:54.550 10:24:16 -- event/cpu_locks.sh@15 -- # [[ -z 193259 ]] 00:05:54.550 10:24:16 -- event/cpu_locks.sh@15 -- # killprocess 193259 00:05:54.550 
10:24:16 -- common/autotest_common.sh@936 -- # '[' -z 193259 ']' 00:05:54.550 10:24:16 -- common/autotest_common.sh@940 -- # kill -0 193259 00:05:54.550 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (193259) - No such process 00:05:54.550 10:24:16 -- common/autotest_common.sh@963 -- # echo 'Process with pid 193259 is not found' 00:05:54.550 Process with pid 193259 is not found 00:05:54.550 10:24:16 -- event/cpu_locks.sh@16 -- # [[ -z 193355 ]] 00:05:54.550 10:24:16 -- event/cpu_locks.sh@16 -- # killprocess 193355 00:05:54.550 10:24:16 -- common/autotest_common.sh@936 -- # '[' -z 193355 ']' 00:05:54.550 10:24:16 -- common/autotest_common.sh@940 -- # kill -0 193355 00:05:54.550 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (193355) - No such process 00:05:54.550 10:24:16 -- common/autotest_common.sh@963 -- # echo 'Process with pid 193355 is not found' 00:05:54.550 Process with pid 193355 is not found 00:05:54.550 10:24:16 -- event/cpu_locks.sh@18 -- # rm -f 00:05:54.550 00:05:54.550 real 0m19.428s 00:05:54.550 user 0m31.108s 00:05:54.550 sys 0m6.479s 00:05:54.550 10:24:16 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:54.550 10:24:16 -- common/autotest_common.sh@10 -- # set +x 00:05:54.550 ************************************ 00:05:54.550 END TEST cpu_locks 00:05:54.550 ************************************ 00:05:54.550 00:05:54.550 real 0m45.673s 00:05:54.550 user 1m22.825s 00:05:54.550 sys 0m11.028s 00:05:54.550 10:24:16 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:54.550 10:24:16 -- common/autotest_common.sh@10 -- # set +x 00:05:54.550 ************************************ 00:05:54.550 END TEST event 00:05:54.550 ************************************ 00:05:54.550 10:24:16 -- spdk/autotest.sh@178 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:05:54.550 10:24:16 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:54.550 10:24:16 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:54.550 10:24:16 -- common/autotest_common.sh@10 -- # set +x 00:05:54.550 ************************************ 00:05:54.550 START TEST thread 00:05:54.550 ************************************ 00:05:54.550 10:24:16 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:05:54.810 * Looking for test storage... 00:05:54.810 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:05:54.810 10:24:16 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:05:54.810 10:24:16 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:05:54.810 10:24:16 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:54.810 10:24:16 -- common/autotest_common.sh@10 -- # set +x 00:05:54.810 ************************************ 00:05:54.810 START TEST thread_poller_perf 00:05:54.810 ************************************ 00:05:54.810 10:24:16 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:05:54.810 [2024-04-19 10:24:16.894043] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
00:05:54.810 [2024-04-19 10:24:16.894145] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193801 ] 00:05:55.070 EAL: No free 2048 kB hugepages reported on node 1 00:05:55.070 [2024-04-19 10:24:16.967853] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:55.070 [2024-04-19 10:24:17.043867] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:55.070 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:05:56.007 ====================================== 00:05:56.007 busy:2304267014 (cyc) 00:05:56.007 total_run_count: 841000 00:05:56.007 tsc_hz: 2300000000 (cyc) 00:05:56.007 ====================================== 00:05:56.007 poller_cost: 2739 (cyc), 1190 (nsec) 00:05:56.007 00:05:56.007 real 0m1.238s 00:05:56.007 user 0m1.143s 00:05:56.007 sys 0m0.091s 00:05:56.007 10:24:18 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:56.007 10:24:18 -- common/autotest_common.sh@10 -- # set +x 00:05:56.007 ************************************ 00:05:56.007 END TEST thread_poller_perf 00:05:56.007 ************************************ 00:05:56.267 10:24:18 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:05:56.267 10:24:18 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:05:56.267 10:24:18 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:56.267 10:24:18 -- common/autotest_common.sh@10 -- # set +x 00:05:56.267 ************************************ 00:05:56.267 START TEST thread_poller_perf 00:05:56.267 ************************************ 00:05:56.267 10:24:18 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:05:56.267 [2024-04-19 10:24:18.288516] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:05:56.267 [2024-04-19 10:24:18.288606] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193994 ] 00:05:56.267 EAL: No free 2048 kB hugepages reported on node 1 00:05:56.267 [2024-04-19 10:24:18.360844] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:56.527 [2024-04-19 10:24:18.441315] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:56.527 Running 1000 pollers for 1 seconds with 0 microseconds period. 
00:05:57.464 ====================================== 00:05:57.464 busy:2301245038 (cyc) 00:05:57.464 total_run_count: 13299000 00:05:57.464 tsc_hz: 2300000000 (cyc) 00:05:57.464 ====================================== 00:05:57.464 poller_cost: 173 (cyc), 75 (nsec) 00:05:57.464 00:05:57.464 real 0m1.237s 00:05:57.464 user 0m1.137s 00:05:57.464 sys 0m0.095s 00:05:57.464 10:24:19 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:57.464 10:24:19 -- common/autotest_common.sh@10 -- # set +x 00:05:57.464 ************************************ 00:05:57.464 END TEST thread_poller_perf 00:05:57.464 ************************************ 00:05:57.464 10:24:19 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:05:57.464 10:24:19 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:05:57.464 10:24:19 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:57.464 10:24:19 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:57.464 10:24:19 -- common/autotest_common.sh@10 -- # set +x 00:05:57.723 ************************************ 00:05:57.723 START TEST thread_spdk_lock 00:05:57.723 ************************************ 00:05:57.723 10:24:19 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:05:57.723 [2024-04-19 10:24:19.668804] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:05:57.723 [2024-04-19 10:24:19.668893] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194195 ] 00:05:57.723 EAL: No free 2048 kB hugepages reported on node 1 00:05:57.723 [2024-04-19 10:24:19.740884] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:57.723 [2024-04-19 10:24:19.817858] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:57.723 [2024-04-19 10:24:19.817861] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:58.291 [2024-04-19 10:24:20.306266] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 955:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:58.291 [2024-04-19 10:24:20.306306] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3062:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:05:58.291 [2024-04-19 10:24:20.306317] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3017:sspin_stacks_print: *ERROR*: spinlock 0x14b1a00 00:05:58.291 [2024-04-19 10:24:20.307291] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:58.291 [2024-04-19 10:24:20.307395] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1016:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:58.291 [2024-04-19 10:24:20.307415] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:58.291 
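The *ERROR*: unrecoverable spinlock error lines above are expected output: the spdk_lock self-test deliberately triggers the thread library's lock diagnostics (deadlock detection, locks still held when an SPDK thread goes off CPU) before its timed phases, and the run still passes, as the assertion summary below shows. A minimal sketch for rerunning just this self-test outside the harness, assuming the tree is already built and hugepages are configured:

  # reproduce the lock self-test standalone (same binary the harness runs;
  # it prints the same intentional *ERROR* diagnostics, then the PASS lines)
  cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  ./test/thread/lock/spdk_lock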
Starting test contend 00:05:58.291 Worker Delay Wait us Hold us Total us 00:05:58.291 0 3 164901 184937 349838 00:05:58.291 1 5 82029 286021 368050 00:05:58.291 PASS test contend 00:05:58.291 Starting test hold_by_poller 00:05:58.291 PASS test hold_by_poller 00:05:58.291 Starting test hold_by_message 00:05:58.291 PASS test hold_by_message 00:05:58.291 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:05:58.291 100014 assertions passed 00:05:58.291 0 assertions failed 00:05:58.291 00:05:58.291 real 0m0.721s 00:05:58.291 user 0m1.118s 00:05:58.291 sys 0m0.088s 00:05:58.291 10:24:20 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:58.291 10:24:20 -- common/autotest_common.sh@10 -- # set +x 00:05:58.291 ************************************ 00:05:58.291 END TEST thread_spdk_lock 00:05:58.291 ************************************ 00:05:58.551 00:05:58.551 real 0m3.752s 00:05:58.551 user 0m3.593s 00:05:58.551 sys 0m0.625s 00:05:58.551 10:24:20 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:58.551 10:24:20 -- common/autotest_common.sh@10 -- # set +x 00:05:58.551 ************************************ 00:05:58.551 END TEST thread 00:05:58.551 ************************************ 00:05:58.551 10:24:20 -- spdk/autotest.sh@179 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:05:58.551 10:24:20 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:58.551 10:24:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:58.551 10:24:20 -- common/autotest_common.sh@10 -- # set +x 00:05:58.551 ************************************ 00:05:58.551 START TEST accel 00:05:58.551 ************************************ 00:05:58.551 10:24:20 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:05:58.810 * Looking for test storage... 00:05:58.810 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:05:58.810 10:24:20 -- accel/accel.sh@81 -- # declare -A expected_opcs 00:05:58.810 10:24:20 -- accel/accel.sh@82 -- # get_expected_opcs 00:05:58.811 10:24:20 -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:58.811 10:24:20 -- accel/accel.sh@62 -- # spdk_tgt_pid=194429 00:05:58.811 10:24:20 -- accel/accel.sh@63 -- # waitforlisten 194429 00:05:58.811 10:24:20 -- common/autotest_common.sh@817 -- # '[' -z 194429 ']' 00:05:58.811 10:24:20 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:58.811 10:24:20 -- accel/accel.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:05:58.811 10:24:20 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:58.811 10:24:20 -- accel/accel.sh@61 -- # build_accel_config 00:05:58.811 10:24:20 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:58.811 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
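The accel suite starting here launches a standalone spdk_tgt, waits for its RPC socket, then asks which module backs each accel opcode; with no accelerator JSON passed (every [[ 0 -gt 0 ]] guard in the traced config is false), each opcode maps to the software module. A sketch of the same query against a running target, with the socket path as traced above and the rpc.py location assumed from the standard SPDK repo layout:

  # list opcode -> module assignments over the RPC socket
  # (same RPC and jq filter the test itself traces)
  ./scripts/rpc.py -s /var/tmp/spdk.sock accel_get_opc_assignments \
    | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'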
00:05:58.811 10:24:20 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:58.811 10:24:20 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:58.811 10:24:20 -- common/autotest_common.sh@10 -- # set +x 00:05:58.811 10:24:20 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:58.811 10:24:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:58.811 10:24:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:58.811 10:24:20 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:58.811 10:24:20 -- accel/accel.sh@40 -- # local IFS=, 00:05:58.811 10:24:20 -- accel/accel.sh@41 -- # jq -r . 00:05:58.811 [2024-04-19 10:24:20.700349] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:05:58.811 [2024-04-19 10:24:20.700410] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194429 ] 00:05:58.811 EAL: No free 2048 kB hugepages reported on node 1 00:05:58.811 [2024-04-19 10:24:20.771149] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:58.811 [2024-04-19 10:24:20.847887] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:59.750 10:24:21 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:59.750 10:24:21 -- common/autotest_common.sh@850 -- # return 0 00:05:59.750 10:24:21 -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:05:59.750 10:24:21 -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:05:59.750 10:24:21 -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:05:59.750 10:24:21 -- accel/accel.sh@68 -- # [[ -n '' ]] 00:05:59.750 10:24:21 -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:05:59.750 10:24:21 -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:05:59.750 10:24:21 -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:05:59.750 10:24:21 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:59.750 10:24:21 -- common/autotest_common.sh@10 -- # set +x 00:05:59.750 10:24:21 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:59.750 10:24:21 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # IFS== 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # read -r opc module 00:05:59.750 10:24:21 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:59.750 10:24:21 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # IFS== 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # read -r opc module 00:05:59.750 10:24:21 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:59.750 10:24:21 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # IFS== 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # read -r opc module 00:05:59.750 10:24:21 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:59.750 10:24:21 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # IFS== 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # read -r opc module 00:05:59.750 10:24:21 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:59.750 10:24:21 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # IFS== 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # read -r opc module 00:05:59.750 10:24:21 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:59.750 10:24:21 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # IFS== 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # read -r opc module 00:05:59.750 10:24:21 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:59.750 10:24:21 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # IFS== 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # read -r opc module 00:05:59.750 10:24:21 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:59.750 10:24:21 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # IFS== 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # read -r opc module 00:05:59.750 10:24:21 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:59.750 10:24:21 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # IFS== 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # read -r opc module 00:05:59.750 10:24:21 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:59.750 10:24:21 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # IFS== 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # read -r opc module 00:05:59.750 10:24:21 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:59.750 10:24:21 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # IFS== 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # read -r opc module 00:05:59.750 10:24:21 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:59.750 10:24:21 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # IFS== 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # read -r opc module 00:05:59.750 
10:24:21 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:59.750 10:24:21 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # IFS== 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # read -r opc module 00:05:59.750 10:24:21 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:59.750 10:24:21 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # IFS== 00:05:59.750 10:24:21 -- accel/accel.sh@72 -- # read -r opc module 00:05:59.750 10:24:21 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:59.750 10:24:21 -- accel/accel.sh@75 -- # killprocess 194429 00:05:59.750 10:24:21 -- common/autotest_common.sh@936 -- # '[' -z 194429 ']' 00:05:59.750 10:24:21 -- common/autotest_common.sh@940 -- # kill -0 194429 00:05:59.750 10:24:21 -- common/autotest_common.sh@941 -- # uname 00:05:59.750 10:24:21 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:59.750 10:24:21 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 194429 00:05:59.750 10:24:21 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:59.750 10:24:21 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:59.750 10:24:21 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 194429' 00:05:59.750 killing process with pid 194429 00:05:59.750 10:24:21 -- common/autotest_common.sh@955 -- # kill 194429 00:05:59.750 10:24:21 -- common/autotest_common.sh@960 -- # wait 194429 00:06:00.010 10:24:21 -- accel/accel.sh@76 -- # trap - ERR 00:06:00.010 10:24:21 -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:06:00.010 10:24:21 -- common/autotest_common.sh@1087 -- # '[' 3 -le 1 ']' 00:06:00.010 10:24:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:00.010 10:24:21 -- common/autotest_common.sh@10 -- # set +x 00:06:00.010 10:24:22 -- common/autotest_common.sh@1111 -- # accel_perf -h 00:06:00.010 10:24:22 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:00.010 10:24:22 -- accel/accel.sh@12 -- # build_accel_config 00:06:00.010 10:24:22 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:00.010 10:24:22 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:00.010 10:24:22 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:00.010 10:24:22 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:00.010 10:24:22 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:00.010 10:24:22 -- accel/accel.sh@40 -- # local IFS=, 00:06:00.010 10:24:22 -- accel/accel.sh@41 -- # jq -r . 
00:06:00.010 10:24:22 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:00.010 10:24:22 -- common/autotest_common.sh@10 -- # set +x 00:06:00.010 10:24:22 -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:00.010 10:24:22 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:00.010 10:24:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:00.010 10:24:22 -- common/autotest_common.sh@10 -- # set +x 00:06:00.270 ************************************ 00:06:00.270 START TEST accel_missing_filename 00:06:00.270 ************************************ 00:06:00.270 10:24:22 -- common/autotest_common.sh@1111 -- # NOT accel_perf -t 1 -w compress 00:06:00.270 10:24:22 -- common/autotest_common.sh@638 -- # local es=0 00:06:00.270 10:24:22 -- common/autotest_common.sh@640 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:00.270 10:24:22 -- common/autotest_common.sh@626 -- # local arg=accel_perf 00:06:00.270 10:24:22 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:00.270 10:24:22 -- common/autotest_common.sh@630 -- # type -t accel_perf 00:06:00.270 10:24:22 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:00.270 10:24:22 -- common/autotest_common.sh@641 -- # accel_perf -t 1 -w compress 00:06:00.270 10:24:22 -- accel/accel.sh@12 -- # build_accel_config 00:06:00.270 10:24:22 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:00.270 10:24:22 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:00.270 10:24:22 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:00.270 10:24:22 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:00.270 10:24:22 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:00.270 10:24:22 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:00.270 10:24:22 -- accel/accel.sh@40 -- # local IFS=, 00:06:00.270 10:24:22 -- accel/accel.sh@41 -- # jq -r . 00:06:00.270 [2024-04-19 10:24:22.250030] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:06:00.270 [2024-04-19 10:24:22.250115] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194648 ] 00:06:00.270 EAL: No free 2048 kB hugepages reported on node 1 00:06:00.270 [2024-04-19 10:24:22.323803] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:00.530 [2024-04-19 10:24:22.402532] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:00.530 [2024-04-19 10:24:22.441658] app.c: 966:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:00.530 [2024-04-19 10:24:22.500418] accel_perf.c:1394:main: *ERROR*: ERROR starting application 00:06:00.530 A filename is required. 
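That abort is the point of this negative test: per accel_perf's usage text (printed in full by the wrong-workload test further below), a compress workload needs -l with an uncompressed input file, so the bare invocation fails during startup. The corresponding well-formed invocation, sketched with the repo's bib sample that the very next test passes:

  # compress with an input file supplied, as the -l option text requires
  ./build/examples/accel_perf -t 1 -w compress -l test/accel/bib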
00:06:00.530 10:24:22 -- common/autotest_common.sh@641 -- # es=234 00:06:00.530 10:24:22 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:00.530 10:24:22 -- common/autotest_common.sh@650 -- # es=106 00:06:00.530 10:24:22 -- common/autotest_common.sh@651 -- # case "$es" in 00:06:00.530 10:24:22 -- common/autotest_common.sh@658 -- # es=1 00:06:00.530 10:24:22 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:00.530 00:06:00.530 real 0m0.342s 00:06:00.530 user 0m0.231s 00:06:00.530 sys 0m0.136s 00:06:00.530 10:24:22 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:00.530 10:24:22 -- common/autotest_common.sh@10 -- # set +x 00:06:00.530 ************************************ 00:06:00.530 END TEST accel_missing_filename 00:06:00.530 ************************************ 00:06:00.530 10:24:22 -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:00.530 10:24:22 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:00.530 10:24:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:00.530 10:24:22 -- common/autotest_common.sh@10 -- # set +x 00:06:00.790 ************************************ 00:06:00.790 START TEST accel_compress_verify 00:06:00.790 ************************************ 00:06:00.790 10:24:22 -- common/autotest_common.sh@1111 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:00.790 10:24:22 -- common/autotest_common.sh@638 -- # local es=0 00:06:00.790 10:24:22 -- common/autotest_common.sh@640 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:00.790 10:24:22 -- common/autotest_common.sh@626 -- # local arg=accel_perf 00:06:00.790 10:24:22 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:00.790 10:24:22 -- common/autotest_common.sh@630 -- # type -t accel_perf 00:06:00.790 10:24:22 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:00.790 10:24:22 -- common/autotest_common.sh@641 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:00.790 10:24:22 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:00.790 10:24:22 -- accel/accel.sh@12 -- # build_accel_config 00:06:00.790 10:24:22 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:00.790 10:24:22 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:00.790 10:24:22 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:00.790 10:24:22 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:00.790 10:24:22 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:00.790 10:24:22 -- accel/accel.sh@40 -- # local IFS=, 00:06:00.790 10:24:22 -- accel/accel.sh@41 -- # jq -r . 00:06:00.790 [2024-04-19 10:24:22.748310] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
00:06:00.790 [2024-04-19 10:24:22.748396] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194685 ] 00:06:00.790 EAL: No free 2048 kB hugepages reported on node 1 00:06:00.790 [2024-04-19 10:24:22.826103] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:01.050 [2024-04-19 10:24:22.908378] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.050 [2024-04-19 10:24:22.948306] app.c: 966:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:01.050 [2024-04-19 10:24:23.008008] accel_perf.c:1394:main: *ERROR*: ERROR starting application 00:06:01.050 00:06:01.050 Compression does not support the verify option, aborting. 00:06:01.050 10:24:23 -- common/autotest_common.sh@641 -- # es=161 00:06:01.050 10:24:23 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:01.050 10:24:23 -- common/autotest_common.sh@650 -- # es=33 00:06:01.050 10:24:23 -- common/autotest_common.sh@651 -- # case "$es" in 00:06:01.050 10:24:23 -- common/autotest_common.sh@658 -- # es=1 00:06:01.050 10:24:23 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:01.050 00:06:01.050 real 0m0.353s 00:06:01.050 user 0m0.257s 00:06:01.050 sys 0m0.135s 00:06:01.050 10:24:23 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:01.050 10:24:23 -- common/autotest_common.sh@10 -- # set +x 00:06:01.050 ************************************ 00:06:01.050 END TEST accel_compress_verify 00:06:01.050 ************************************ 00:06:01.050 10:24:23 -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:01.050 10:24:23 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:01.050 10:24:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:01.050 10:24:23 -- common/autotest_common.sh@10 -- # set +x 00:06:01.308 ************************************ 00:06:01.308 START TEST accel_wrong_workload 00:06:01.308 ************************************ 00:06:01.308 10:24:23 -- common/autotest_common.sh@1111 -- # NOT accel_perf -t 1 -w foobar 00:06:01.308 10:24:23 -- common/autotest_common.sh@638 -- # local es=0 00:06:01.308 10:24:23 -- common/autotest_common.sh@640 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:01.308 10:24:23 -- common/autotest_common.sh@626 -- # local arg=accel_perf 00:06:01.308 10:24:23 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:01.308 10:24:23 -- common/autotest_common.sh@630 -- # type -t accel_perf 00:06:01.308 10:24:23 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:01.308 10:24:23 -- common/autotest_common.sh@641 -- # accel_perf -t 1 -w foobar 00:06:01.308 10:24:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:01.308 10:24:23 -- accel/accel.sh@12 -- # build_accel_config 00:06:01.308 10:24:23 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:01.308 10:24:23 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:01.308 10:24:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:01.308 10:24:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:01.308 10:24:23 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:01.308 10:24:23 -- accel/accel.sh@40 -- # local IFS=, 00:06:01.308 10:24:23 -- accel/accel.sh@41 -- # jq -r . 
00:06:01.308 Unsupported workload type: foobar 00:06:01.308 [2024-04-19 10:24:23.255061] app.c:1364:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:01.308 accel_perf options: 00:06:01.308 [-h help message] 00:06:01.308 [-q queue depth per core] 00:06:01.308 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:01.308 [-T number of threads per core 00:06:01.308 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:01.308 [-t time in seconds] 00:06:01.308 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:01.308 [ dif_verify, , dif_generate, dif_generate_copy 00:06:01.308 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:01.308 [-l for compress/decompress workloads, name of uncompressed input file 00:06:01.308 [-S for crc32c workload, use this seed value (default 0) 00:06:01.308 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:01.308 [-f for fill workload, use this BYTE value (default 255) 00:06:01.308 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:01.308 [-y verify result if this switch is on] 00:06:01.308 [-a tasks to allocate per core (default: same value as -q)] 00:06:01.308 Can be used to spread operations across a wider range of memory. 00:06:01.308 10:24:23 -- common/autotest_common.sh@641 -- # es=1 00:06:01.308 10:24:23 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:01.308 10:24:23 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:06:01.308 10:24:23 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:01.308 00:06:01.308 real 0m0.029s 00:06:01.308 user 0m0.012s 00:06:01.308 sys 0m0.016s 00:06:01.308 10:24:23 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:01.308 10:24:23 -- common/autotest_common.sh@10 -- # set +x 00:06:01.308 ************************************ 00:06:01.308 END TEST accel_wrong_workload 00:06:01.308 ************************************ 00:06:01.308 Error: writing output failed: Broken pipe 00:06:01.308 10:24:23 -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:01.308 10:24:23 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:01.308 10:24:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:01.308 10:24:23 -- common/autotest_common.sh@10 -- # set +x 00:06:01.567 ************************************ 00:06:01.567 START TEST accel_negative_buffers 00:06:01.567 ************************************ 00:06:01.567 10:24:23 -- common/autotest_common.sh@1111 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:01.567 10:24:23 -- common/autotest_common.sh@638 -- # local es=0 00:06:01.567 10:24:23 -- common/autotest_common.sh@640 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:01.567 10:24:23 -- common/autotest_common.sh@626 -- # local arg=accel_perf 00:06:01.567 10:24:23 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:01.567 10:24:23 -- common/autotest_common.sh@630 -- # type -t accel_perf 00:06:01.567 10:24:23 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:01.567 10:24:23 -- common/autotest_common.sh@641 -- # accel_perf -t 1 -w xor -y -x -1 00:06:01.567 10:24:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:06:01.567 10:24:23 -- accel/accel.sh@12 -- # build_accel_config 00:06:01.567 10:24:23 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:01.567 10:24:23 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:01.567 10:24:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:01.567 10:24:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:01.567 10:24:23 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:01.567 10:24:23 -- accel/accel.sh@40 -- # local IFS=, 00:06:01.567 10:24:23 -- accel/accel.sh@41 -- # jq -r . 00:06:01.567 -x option must be non-negative. 00:06:01.567 [2024-04-19 10:24:23.444249] app.c:1364:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:01.567 accel_perf options: 00:06:01.567 [-h help message] 00:06:01.567 [-q queue depth per core] 00:06:01.567 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:01.567 [-T number of threads per core 00:06:01.567 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:01.567 [-t time in seconds] 00:06:01.567 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:01.567 [ dif_verify, , dif_generate, dif_generate_copy 00:06:01.567 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:01.567 [-l for compress/decompress workloads, name of uncompressed input file 00:06:01.567 [-S for crc32c workload, use this seed value (default 0) 00:06:01.567 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:01.567 [-f for fill workload, use this BYTE value (default 255) 00:06:01.567 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:01.567 [-y verify result if this switch is on] 00:06:01.567 [-a tasks to allocate per core (default: same value as -q)] 00:06:01.567 Can be used to spread operations across a wider range of memory. 
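For reference, well-formed uses of the flags documented in the usage text above, matching invocations traced elsewhere in this run (binary path shortened for readability):

  B=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf
  $B -t 1 -w crc32c -S 32 -y              # crc32c with seed 32, verifying results
  $B -t 1 -w xor -y -x 2                  # xor with the documented minimum of two source buffers
  $B -t 1 -w fill -f 128 -q 64 -a 64 -y   # fill with BYTE 128, queue depth 64, 64 tasks per core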
00:06:01.567 10:24:23 -- common/autotest_common.sh@641 -- # es=1 00:06:01.567 10:24:23 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:01.567 10:24:23 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:06:01.567 10:24:23 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:01.567 00:06:01.567 real 0m0.030s 00:06:01.567 user 0m0.015s 00:06:01.567 sys 0m0.015s 00:06:01.567 10:24:23 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:01.567 10:24:23 -- common/autotest_common.sh@10 -- # set +x 00:06:01.567 ************************************ 00:06:01.567 END TEST accel_negative_buffers 00:06:01.567 ************************************ 00:06:01.567 Error: writing output failed: Broken pipe 00:06:01.567 10:24:23 -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:01.567 10:24:23 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:01.567 10:24:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:01.567 10:24:23 -- common/autotest_common.sh@10 -- # set +x 00:06:01.567 ************************************ 00:06:01.567 START TEST accel_crc32c 00:06:01.567 ************************************ 00:06:01.567 10:24:23 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:01.567 10:24:23 -- accel/accel.sh@16 -- # local accel_opc 00:06:01.567 10:24:23 -- accel/accel.sh@17 -- # local accel_module 00:06:01.567 10:24:23 -- accel/accel.sh@19 -- # IFS=: 00:06:01.567 10:24:23 -- accel/accel.sh@19 -- # read -r var val 00:06:01.567 10:24:23 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:01.567 10:24:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:01.567 10:24:23 -- accel/accel.sh@12 -- # build_accel_config 00:06:01.567 10:24:23 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:01.567 10:24:23 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:01.567 10:24:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:01.567 10:24:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:01.567 10:24:23 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:01.567 10:24:23 -- accel/accel.sh@40 -- # local IFS=, 00:06:01.567 10:24:23 -- accel/accel.sh@41 -- # jq -r . 00:06:01.567 [2024-04-19 10:24:23.641604] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
00:06:01.567 [2024-04-19 10:24:23.641679] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194923 ] 00:06:01.827 EAL: No free 2048 kB hugepages reported on node 1 00:06:01.827 [2024-04-19 10:24:23.714402] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:01.827 [2024-04-19 10:24:23.791052] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.827 10:24:23 -- accel/accel.sh@20 -- # val= 00:06:01.827 10:24:23 -- accel/accel.sh@21 -- # case "$var" in 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # IFS=: 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # read -r var val 00:06:01.827 10:24:23 -- accel/accel.sh@20 -- # val= 00:06:01.827 10:24:23 -- accel/accel.sh@21 -- # case "$var" in 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # IFS=: 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # read -r var val 00:06:01.827 10:24:23 -- accel/accel.sh@20 -- # val=0x1 00:06:01.827 10:24:23 -- accel/accel.sh@21 -- # case "$var" in 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # IFS=: 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # read -r var val 00:06:01.827 10:24:23 -- accel/accel.sh@20 -- # val= 00:06:01.827 10:24:23 -- accel/accel.sh@21 -- # case "$var" in 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # IFS=: 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # read -r var val 00:06:01.827 10:24:23 -- accel/accel.sh@20 -- # val= 00:06:01.827 10:24:23 -- accel/accel.sh@21 -- # case "$var" in 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # IFS=: 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # read -r var val 00:06:01.827 10:24:23 -- accel/accel.sh@20 -- # val=crc32c 00:06:01.827 10:24:23 -- accel/accel.sh@21 -- # case "$var" in 00:06:01.827 10:24:23 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # IFS=: 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # read -r var val 00:06:01.827 10:24:23 -- accel/accel.sh@20 -- # val=32 00:06:01.827 10:24:23 -- accel/accel.sh@21 -- # case "$var" in 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # IFS=: 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # read -r var val 00:06:01.827 10:24:23 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:01.827 10:24:23 -- accel/accel.sh@21 -- # case "$var" in 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # IFS=: 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # read -r var val 00:06:01.827 10:24:23 -- accel/accel.sh@20 -- # val= 00:06:01.827 10:24:23 -- accel/accel.sh@21 -- # case "$var" in 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # IFS=: 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # read -r var val 00:06:01.827 10:24:23 -- accel/accel.sh@20 -- # val=software 00:06:01.827 10:24:23 -- accel/accel.sh@21 -- # case "$var" in 00:06:01.827 10:24:23 -- accel/accel.sh@22 -- # accel_module=software 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # IFS=: 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # read -r var val 00:06:01.827 10:24:23 -- accel/accel.sh@20 -- # val=32 00:06:01.827 10:24:23 -- accel/accel.sh@21 -- # case "$var" in 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # IFS=: 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # read -r var val 00:06:01.827 10:24:23 -- accel/accel.sh@20 -- # val=32 00:06:01.827 10:24:23 -- accel/accel.sh@21 -- # case "$var" in 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # IFS=: 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # read -r var val 00:06:01.827 10:24:23 -- 
accel/accel.sh@20 -- # val=1 00:06:01.827 10:24:23 -- accel/accel.sh@21 -- # case "$var" in 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # IFS=: 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # read -r var val 00:06:01.827 10:24:23 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:01.827 10:24:23 -- accel/accel.sh@21 -- # case "$var" in 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # IFS=: 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # read -r var val 00:06:01.827 10:24:23 -- accel/accel.sh@20 -- # val=Yes 00:06:01.827 10:24:23 -- accel/accel.sh@21 -- # case "$var" in 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # IFS=: 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # read -r var val 00:06:01.827 10:24:23 -- accel/accel.sh@20 -- # val= 00:06:01.827 10:24:23 -- accel/accel.sh@21 -- # case "$var" in 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # IFS=: 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # read -r var val 00:06:01.827 10:24:23 -- accel/accel.sh@20 -- # val= 00:06:01.827 10:24:23 -- accel/accel.sh@21 -- # case "$var" in 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # IFS=: 00:06:01.827 10:24:23 -- accel/accel.sh@19 -- # read -r var val 00:06:03.207 10:24:24 -- accel/accel.sh@20 -- # val= 00:06:03.207 10:24:24 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.207 10:24:24 -- accel/accel.sh@19 -- # IFS=: 00:06:03.207 10:24:24 -- accel/accel.sh@19 -- # read -r var val 00:06:03.207 10:24:24 -- accel/accel.sh@20 -- # val= 00:06:03.207 10:24:24 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.207 10:24:24 -- accel/accel.sh@19 -- # IFS=: 00:06:03.207 10:24:24 -- accel/accel.sh@19 -- # read -r var val 00:06:03.207 10:24:24 -- accel/accel.sh@20 -- # val= 00:06:03.207 10:24:24 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.207 10:24:24 -- accel/accel.sh@19 -- # IFS=: 00:06:03.207 10:24:24 -- accel/accel.sh@19 -- # read -r var val 00:06:03.207 10:24:24 -- accel/accel.sh@20 -- # val= 00:06:03.207 10:24:24 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.207 10:24:24 -- accel/accel.sh@19 -- # IFS=: 00:06:03.207 10:24:24 -- accel/accel.sh@19 -- # read -r var val 00:06:03.207 10:24:24 -- accel/accel.sh@20 -- # val= 00:06:03.207 10:24:24 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.207 10:24:24 -- accel/accel.sh@19 -- # IFS=: 00:06:03.207 10:24:24 -- accel/accel.sh@19 -- # read -r var val 00:06:03.207 10:24:24 -- accel/accel.sh@20 -- # val= 00:06:03.207 10:24:24 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.207 10:24:24 -- accel/accel.sh@19 -- # IFS=: 00:06:03.207 10:24:24 -- accel/accel.sh@19 -- # read -r var val 00:06:03.207 10:24:24 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:03.207 10:24:24 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:06:03.207 10:24:24 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:03.207 00:06:03.207 real 0m1.347s 00:06:03.207 user 0m1.229s 00:06:03.207 sys 0m0.130s 00:06:03.207 10:24:24 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:03.207 10:24:24 -- common/autotest_common.sh@10 -- # set +x 00:06:03.207 ************************************ 00:06:03.207 END TEST accel_crc32c 00:06:03.207 ************************************ 00:06:03.207 10:24:25 -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:03.207 10:24:25 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:03.207 10:24:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:03.207 10:24:25 -- common/autotest_common.sh@10 -- # set +x 00:06:03.207 ************************************ 00:06:03.207 START TEST 
accel_crc32c_C2 00:06:03.207 ************************************ 00:06:03.207 10:24:25 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:03.207 10:24:25 -- accel/accel.sh@16 -- # local accel_opc 00:06:03.207 10:24:25 -- accel/accel.sh@17 -- # local accel_module 00:06:03.207 10:24:25 -- accel/accel.sh@19 -- # IFS=: 00:06:03.207 10:24:25 -- accel/accel.sh@19 -- # read -r var val 00:06:03.207 10:24:25 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:03.207 10:24:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:03.207 10:24:25 -- accel/accel.sh@12 -- # build_accel_config 00:06:03.207 10:24:25 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:03.207 10:24:25 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:03.207 10:24:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:03.207 10:24:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:03.207 10:24:25 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:03.207 10:24:25 -- accel/accel.sh@40 -- # local IFS=, 00:06:03.207 10:24:25 -- accel/accel.sh@41 -- # jq -r . 00:06:03.207 [2024-04-19 10:24:25.143925] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:06:03.207 [2024-04-19 10:24:25.143997] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid195117 ] 00:06:03.207 EAL: No free 2048 kB hugepages reported on node 1 00:06:03.207 [2024-04-19 10:24:25.215248] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:03.207 [2024-04-19 10:24:25.291579] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.467 10:24:25 -- accel/accel.sh@20 -- # val= 00:06:03.467 10:24:25 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # IFS=: 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # read -r var val 00:06:03.467 10:24:25 -- accel/accel.sh@20 -- # val= 00:06:03.467 10:24:25 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # IFS=: 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # read -r var val 00:06:03.467 10:24:25 -- accel/accel.sh@20 -- # val=0x1 00:06:03.467 10:24:25 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # IFS=: 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # read -r var val 00:06:03.467 10:24:25 -- accel/accel.sh@20 -- # val= 00:06:03.467 10:24:25 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # IFS=: 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # read -r var val 00:06:03.467 10:24:25 -- accel/accel.sh@20 -- # val= 00:06:03.467 10:24:25 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # IFS=: 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # read -r var val 00:06:03.467 10:24:25 -- accel/accel.sh@20 -- # val=crc32c 00:06:03.467 10:24:25 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.467 10:24:25 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # IFS=: 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # read -r var val 00:06:03.467 10:24:25 -- accel/accel.sh@20 -- # val=0 00:06:03.467 10:24:25 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # IFS=: 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # read -r var val 00:06:03.467 10:24:25 
-- accel/accel.sh@20 -- # val='4096 bytes' 00:06:03.467 10:24:25 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # IFS=: 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # read -r var val 00:06:03.467 10:24:25 -- accel/accel.sh@20 -- # val= 00:06:03.467 10:24:25 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # IFS=: 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # read -r var val 00:06:03.467 10:24:25 -- accel/accel.sh@20 -- # val=software 00:06:03.467 10:24:25 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.467 10:24:25 -- accel/accel.sh@22 -- # accel_module=software 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # IFS=: 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # read -r var val 00:06:03.467 10:24:25 -- accel/accel.sh@20 -- # val=32 00:06:03.467 10:24:25 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # IFS=: 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # read -r var val 00:06:03.467 10:24:25 -- accel/accel.sh@20 -- # val=32 00:06:03.467 10:24:25 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # IFS=: 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # read -r var val 00:06:03.467 10:24:25 -- accel/accel.sh@20 -- # val=1 00:06:03.467 10:24:25 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # IFS=: 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # read -r var val 00:06:03.467 10:24:25 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:03.467 10:24:25 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # IFS=: 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # read -r var val 00:06:03.467 10:24:25 -- accel/accel.sh@20 -- # val=Yes 00:06:03.467 10:24:25 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # IFS=: 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # read -r var val 00:06:03.467 10:24:25 -- accel/accel.sh@20 -- # val= 00:06:03.467 10:24:25 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # IFS=: 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # read -r var val 00:06:03.467 10:24:25 -- accel/accel.sh@20 -- # val= 00:06:03.467 10:24:25 -- accel/accel.sh@21 -- # case "$var" in 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # IFS=: 00:06:03.467 10:24:25 -- accel/accel.sh@19 -- # read -r var val 00:06:04.406 10:24:26 -- accel/accel.sh@20 -- # val= 00:06:04.406 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.406 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.406 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.406 10:24:26 -- accel/accel.sh@20 -- # val= 00:06:04.406 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.406 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.406 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.406 10:24:26 -- accel/accel.sh@20 -- # val= 00:06:04.406 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.406 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.406 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.406 10:24:26 -- accel/accel.sh@20 -- # val= 00:06:04.406 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.406 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.406 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.406 10:24:26 -- accel/accel.sh@20 -- # val= 00:06:04.406 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.406 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.406 10:24:26 
-- accel/accel.sh@19 -- # read -r var val 00:06:04.406 10:24:26 -- accel/accel.sh@20 -- # val= 00:06:04.406 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.406 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.406 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.406 10:24:26 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:04.406 10:24:26 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:06:04.406 10:24:26 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:04.406 00:06:04.406 real 0m1.346s 00:06:04.406 user 0m1.232s 00:06:04.406 sys 0m0.127s 00:06:04.406 10:24:26 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:04.406 10:24:26 -- common/autotest_common.sh@10 -- # set +x 00:06:04.406 ************************************ 00:06:04.406 END TEST accel_crc32c_C2 00:06:04.406 ************************************ 00:06:04.406 10:24:26 -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:04.406 10:24:26 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:04.406 10:24:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:04.406 10:24:26 -- common/autotest_common.sh@10 -- # set +x 00:06:04.665 ************************************ 00:06:04.665 START TEST accel_copy 00:06:04.665 ************************************ 00:06:04.665 10:24:26 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w copy -y 00:06:04.665 10:24:26 -- accel/accel.sh@16 -- # local accel_opc 00:06:04.665 10:24:26 -- accel/accel.sh@17 -- # local accel_module 00:06:04.665 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.665 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.665 10:24:26 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:04.665 10:24:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:04.665 10:24:26 -- accel/accel.sh@12 -- # build_accel_config 00:06:04.665 10:24:26 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:04.665 10:24:26 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:04.665 10:24:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:04.665 10:24:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:04.665 10:24:26 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:04.665 10:24:26 -- accel/accel.sh@40 -- # local IFS=, 00:06:04.665 10:24:26 -- accel/accel.sh@41 -- # jq -r . 00:06:04.665 [2024-04-19 10:24:26.636698] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
00:06:04.665 [2024-04-19 10:24:26.636795] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid195316 ] 00:06:04.665 EAL: No free 2048 kB hugepages reported on node 1 00:06:04.665 [2024-04-19 10:24:26.706878] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:04.926 [2024-04-19 10:24:26.784344] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:04.926 10:24:26 -- accel/accel.sh@20 -- # val= 00:06:04.926 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.926 10:24:26 -- accel/accel.sh@20 -- # val= 00:06:04.926 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.926 10:24:26 -- accel/accel.sh@20 -- # val=0x1 00:06:04.926 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.926 10:24:26 -- accel/accel.sh@20 -- # val= 00:06:04.926 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.926 10:24:26 -- accel/accel.sh@20 -- # val= 00:06:04.926 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.926 10:24:26 -- accel/accel.sh@20 -- # val=copy 00:06:04.926 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.926 10:24:26 -- accel/accel.sh@23 -- # accel_opc=copy 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.926 10:24:26 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:04.926 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.926 10:24:26 -- accel/accel.sh@20 -- # val= 00:06:04.926 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.926 10:24:26 -- accel/accel.sh@20 -- # val=software 00:06:04.926 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.926 10:24:26 -- accel/accel.sh@22 -- # accel_module=software 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.926 10:24:26 -- accel/accel.sh@20 -- # val=32 00:06:04.926 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.926 10:24:26 -- accel/accel.sh@20 -- # val=32 00:06:04.926 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.926 10:24:26 -- accel/accel.sh@20 -- # val=1 00:06:04.926 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.926 10:24:26 -- 
accel/accel.sh@20 -- # val='1 seconds' 00:06:04.926 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.926 10:24:26 -- accel/accel.sh@20 -- # val=Yes 00:06:04.926 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.926 10:24:26 -- accel/accel.sh@20 -- # val= 00:06:04.926 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:04.926 10:24:26 -- accel/accel.sh@20 -- # val= 00:06:04.926 10:24:26 -- accel/accel.sh@21 -- # case "$var" in 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # IFS=: 00:06:04.926 10:24:26 -- accel/accel.sh@19 -- # read -r var val 00:06:05.863 10:24:27 -- accel/accel.sh@20 -- # val= 00:06:05.863 10:24:27 -- accel/accel.sh@21 -- # case "$var" in 00:06:05.863 10:24:27 -- accel/accel.sh@19 -- # IFS=: 00:06:05.863 10:24:27 -- accel/accel.sh@19 -- # read -r var val 00:06:05.863 10:24:27 -- accel/accel.sh@20 -- # val= 00:06:05.863 10:24:27 -- accel/accel.sh@21 -- # case "$var" in 00:06:05.863 10:24:27 -- accel/accel.sh@19 -- # IFS=: 00:06:05.863 10:24:27 -- accel/accel.sh@19 -- # read -r var val 00:06:05.863 10:24:27 -- accel/accel.sh@20 -- # val= 00:06:05.863 10:24:27 -- accel/accel.sh@21 -- # case "$var" in 00:06:05.863 10:24:27 -- accel/accel.sh@19 -- # IFS=: 00:06:05.863 10:24:27 -- accel/accel.sh@19 -- # read -r var val 00:06:05.863 10:24:27 -- accel/accel.sh@20 -- # val= 00:06:05.863 10:24:27 -- accel/accel.sh@21 -- # case "$var" in 00:06:05.863 10:24:27 -- accel/accel.sh@19 -- # IFS=: 00:06:05.863 10:24:27 -- accel/accel.sh@19 -- # read -r var val 00:06:05.863 10:24:27 -- accel/accel.sh@20 -- # val= 00:06:05.863 10:24:27 -- accel/accel.sh@21 -- # case "$var" in 00:06:05.863 10:24:27 -- accel/accel.sh@19 -- # IFS=: 00:06:05.863 10:24:27 -- accel/accel.sh@19 -- # read -r var val 00:06:05.863 10:24:27 -- accel/accel.sh@20 -- # val= 00:06:05.863 10:24:27 -- accel/accel.sh@21 -- # case "$var" in 00:06:05.863 10:24:27 -- accel/accel.sh@19 -- # IFS=: 00:06:05.863 10:24:27 -- accel/accel.sh@19 -- # read -r var val 00:06:05.863 10:24:27 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:05.863 10:24:27 -- accel/accel.sh@27 -- # [[ -n copy ]] 00:06:05.863 10:24:27 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:05.863 00:06:05.863 real 0m1.345s 00:06:05.863 user 0m1.226s 00:06:05.863 sys 0m0.132s 00:06:05.863 10:24:27 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:05.863 10:24:27 -- common/autotest_common.sh@10 -- # set +x 00:06:05.863 ************************************ 00:06:05.863 END TEST accel_copy 00:06:05.863 ************************************ 00:06:06.122 10:24:28 -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:06.122 10:24:28 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:06:06.122 10:24:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:06.122 10:24:28 -- common/autotest_common.sh@10 -- # set +x 00:06:06.122 ************************************ 00:06:06.122 START TEST accel_fill 00:06:06.122 ************************************ 00:06:06.122 10:24:28 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:06.122 10:24:28 -- accel/accel.sh@16 -- # local accel_opc 
00:06:06.122 10:24:28 -- accel/accel.sh@17 -- # local accel_module 00:06:06.122 10:24:28 -- accel/accel.sh@19 -- # IFS=: 00:06:06.122 10:24:28 -- accel/accel.sh@19 -- # read -r var val 00:06:06.122 10:24:28 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:06.122 10:24:28 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:06.122 10:24:28 -- accel/accel.sh@12 -- # build_accel_config 00:06:06.122 10:24:28 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:06.122 10:24:28 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:06.122 10:24:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:06.122 10:24:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:06.122 10:24:28 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:06.122 10:24:28 -- accel/accel.sh@40 -- # local IFS=, 00:06:06.122 10:24:28 -- accel/accel.sh@41 -- # jq -r . 00:06:06.122 [2024-04-19 10:24:28.147295] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:06:06.122 [2024-04-19 10:24:28.147384] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid195512 ] 00:06:06.122 EAL: No free 2048 kB hugepages reported on node 1 00:06:06.122 [2024-04-19 10:24:28.226647] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:06.382 [2024-04-19 10:24:28.307189] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.382 10:24:28 -- accel/accel.sh@20 -- # val= 00:06:06.382 10:24:28 -- accel/accel.sh@21 -- # case "$var" in 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # IFS=: 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # read -r var val 00:06:06.382 10:24:28 -- accel/accel.sh@20 -- # val= 00:06:06.382 10:24:28 -- accel/accel.sh@21 -- # case "$var" in 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # IFS=: 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # read -r var val 00:06:06.382 10:24:28 -- accel/accel.sh@20 -- # val=0x1 00:06:06.382 10:24:28 -- accel/accel.sh@21 -- # case "$var" in 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # IFS=: 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # read -r var val 00:06:06.382 10:24:28 -- accel/accel.sh@20 -- # val= 00:06:06.382 10:24:28 -- accel/accel.sh@21 -- # case "$var" in 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # IFS=: 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # read -r var val 00:06:06.382 10:24:28 -- accel/accel.sh@20 -- # val= 00:06:06.382 10:24:28 -- accel/accel.sh@21 -- # case "$var" in 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # IFS=: 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # read -r var val 00:06:06.382 10:24:28 -- accel/accel.sh@20 -- # val=fill 00:06:06.382 10:24:28 -- accel/accel.sh@21 -- # case "$var" in 00:06:06.382 10:24:28 -- accel/accel.sh@23 -- # accel_opc=fill 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # IFS=: 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # read -r var val 00:06:06.382 10:24:28 -- accel/accel.sh@20 -- # val=0x80 00:06:06.382 10:24:28 -- accel/accel.sh@21 -- # case "$var" in 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # IFS=: 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # read -r var val 00:06:06.382 10:24:28 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:06.382 10:24:28 -- accel/accel.sh@21 -- # case "$var" in 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # IFS=: 00:06:06.382 10:24:28 -- accel/accel.sh@19 
-- # read -r var val 00:06:06.382 10:24:28 -- accel/accel.sh@20 -- # val= 00:06:06.382 10:24:28 -- accel/accel.sh@21 -- # case "$var" in 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # IFS=: 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # read -r var val 00:06:06.382 10:24:28 -- accel/accel.sh@20 -- # val=software 00:06:06.382 10:24:28 -- accel/accel.sh@21 -- # case "$var" in 00:06:06.382 10:24:28 -- accel/accel.sh@22 -- # accel_module=software 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # IFS=: 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # read -r var val 00:06:06.382 10:24:28 -- accel/accel.sh@20 -- # val=64 00:06:06.382 10:24:28 -- accel/accel.sh@21 -- # case "$var" in 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # IFS=: 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # read -r var val 00:06:06.382 10:24:28 -- accel/accel.sh@20 -- # val=64 00:06:06.382 10:24:28 -- accel/accel.sh@21 -- # case "$var" in 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # IFS=: 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # read -r var val 00:06:06.382 10:24:28 -- accel/accel.sh@20 -- # val=1 00:06:06.382 10:24:28 -- accel/accel.sh@21 -- # case "$var" in 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # IFS=: 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # read -r var val 00:06:06.382 10:24:28 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:06.382 10:24:28 -- accel/accel.sh@21 -- # case "$var" in 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # IFS=: 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # read -r var val 00:06:06.382 10:24:28 -- accel/accel.sh@20 -- # val=Yes 00:06:06.382 10:24:28 -- accel/accel.sh@21 -- # case "$var" in 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # IFS=: 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # read -r var val 00:06:06.382 10:24:28 -- accel/accel.sh@20 -- # val= 00:06:06.382 10:24:28 -- accel/accel.sh@21 -- # case "$var" in 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # IFS=: 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # read -r var val 00:06:06.382 10:24:28 -- accel/accel.sh@20 -- # val= 00:06:06.382 10:24:28 -- accel/accel.sh@21 -- # case "$var" in 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # IFS=: 00:06:06.382 10:24:28 -- accel/accel.sh@19 -- # read -r var val 00:06:07.759 10:24:29 -- accel/accel.sh@20 -- # val= 00:06:07.759 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.759 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.759 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.759 10:24:29 -- accel/accel.sh@20 -- # val= 00:06:07.759 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.759 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.759 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.759 10:24:29 -- accel/accel.sh@20 -- # val= 00:06:07.759 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.759 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.759 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.759 10:24:29 -- accel/accel.sh@20 -- # val= 00:06:07.759 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.759 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.759 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.759 10:24:29 -- accel/accel.sh@20 -- # val= 00:06:07.759 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.759 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.759 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.759 10:24:29 -- accel/accel.sh@20 -- # val= 00:06:07.759 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.759 10:24:29 -- accel/accel.sh@19 
-- # IFS=: 00:06:07.759 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.759 10:24:29 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:07.759 10:24:29 -- accel/accel.sh@27 -- # [[ -n fill ]] 00:06:07.759 10:24:29 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:07.759 00:06:07.759 real 0m1.358s 00:06:07.759 user 0m1.233s 00:06:07.759 sys 0m0.139s 00:06:07.759 10:24:29 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:07.759 10:24:29 -- common/autotest_common.sh@10 -- # set +x 00:06:07.759 ************************************ 00:06:07.759 END TEST accel_fill 00:06:07.759 ************************************ 00:06:07.759 10:24:29 -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:07.759 10:24:29 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:07.760 10:24:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:07.760 10:24:29 -- common/autotest_common.sh@10 -- # set +x 00:06:07.760 ************************************ 00:06:07.760 START TEST accel_copy_crc32c 00:06:07.760 ************************************ 00:06:07.760 10:24:29 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w copy_crc32c -y 00:06:07.760 10:24:29 -- accel/accel.sh@16 -- # local accel_opc 00:06:07.760 10:24:29 -- accel/accel.sh@17 -- # local accel_module 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.760 10:24:29 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:07.760 10:24:29 -- accel/accel.sh@12 -- # build_accel_config 00:06:07.760 10:24:29 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:07.760 10:24:29 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:07.760 10:24:29 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:07.760 10:24:29 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:07.760 10:24:29 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:07.760 10:24:29 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:07.760 10:24:29 -- accel/accel.sh@40 -- # local IFS=, 00:06:07.760 10:24:29 -- accel/accel.sh@41 -- # jq -r . 00:06:07.760 [2024-04-19 10:24:29.664742] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
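Each test above closes with three accel.sh@27 checks, most recently after accel_fill. Written out as a hedged reconstruction: the quoting is inferred from bash xtrace, which backslash-escapes the right-hand side of a literal [[ == ]] comparison, yielding the `\s\o\f\t\w\a\r\e` form in the trace.

[[ -n "$accel_module" ]]             # a module name was parsed from the run
[[ -n "$accel_opc" ]]                # the opcode (fill, copy_crc32c, ...) was parsed
[[ "$accel_module" == "software" ]]  # rendered above as [[ software == \s\o\f\t\w\a\r\e ]]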
00:06:07.760 [2024-04-19 10:24:29.664919] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid195815 ] 00:06:07.760 EAL: No free 2048 kB hugepages reported on node 1 00:06:07.760 [2024-04-19 10:24:29.737274] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:07.760 [2024-04-19 10:24:29.814843] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:07.760 10:24:29 -- accel/accel.sh@20 -- # val= 00:06:07.760 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.760 10:24:29 -- accel/accel.sh@20 -- # val= 00:06:07.760 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.760 10:24:29 -- accel/accel.sh@20 -- # val=0x1 00:06:07.760 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.760 10:24:29 -- accel/accel.sh@20 -- # val= 00:06:07.760 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.760 10:24:29 -- accel/accel.sh@20 -- # val= 00:06:07.760 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.760 10:24:29 -- accel/accel.sh@20 -- # val=copy_crc32c 00:06:07.760 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.760 10:24:29 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.760 10:24:29 -- accel/accel.sh@20 -- # val=0 00:06:07.760 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.760 10:24:29 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:07.760 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.760 10:24:29 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:07.760 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.760 10:24:29 -- accel/accel.sh@20 -- # val= 00:06:07.760 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.760 10:24:29 -- accel/accel.sh@20 -- # val=software 00:06:07.760 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.760 10:24:29 -- accel/accel.sh@22 -- # accel_module=software 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.760 10:24:29 -- accel/accel.sh@20 -- # val=32 00:06:07.760 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # read -r var val 
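The repeated `case "$var" in` / `IFS=:` / `read -r var val` lines that dominate these traces record accel.sh consuming the colon-separated settings accel_perf prints (apparently core mask, opcode, transfer size, module, queue depth, run time, and verify flag, judging by the values). A sketch of the implied loop; the case patterns are assumptions, while the accel_opc= and accel_module= assignments are attested at accel.sh@23 and accel.sh@22:

while IFS=: read -r var val; do
    case "$var" in
        *opcode*) accel_opc=$val ;;      # e.g. copy_crc32c
        *module*) accel_module=$val ;;   # e.g. software
    esac
done < <(build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y)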
00:06:07.760 10:24:29 -- accel/accel.sh@20 -- # val=32 00:06:07.760 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.760 10:24:29 -- accel/accel.sh@20 -- # val=1 00:06:07.760 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:07.760 10:24:29 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:07.760 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:07.760 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:08.019 10:24:29 -- accel/accel.sh@20 -- # val=Yes 00:06:08.019 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:08.019 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:08.019 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:08.019 10:24:29 -- accel/accel.sh@20 -- # val= 00:06:08.019 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:08.019 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:08.019 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:08.019 10:24:29 -- accel/accel.sh@20 -- # val= 00:06:08.019 10:24:29 -- accel/accel.sh@21 -- # case "$var" in 00:06:08.019 10:24:29 -- accel/accel.sh@19 -- # IFS=: 00:06:08.019 10:24:29 -- accel/accel.sh@19 -- # read -r var val 00:06:08.955 10:24:30 -- accel/accel.sh@20 -- # val= 00:06:08.955 10:24:30 -- accel/accel.sh@21 -- # case "$var" in 00:06:08.955 10:24:30 -- accel/accel.sh@19 -- # IFS=: 00:06:08.955 10:24:30 -- accel/accel.sh@19 -- # read -r var val 00:06:08.955 10:24:30 -- accel/accel.sh@20 -- # val= 00:06:08.955 10:24:30 -- accel/accel.sh@21 -- # case "$var" in 00:06:08.955 10:24:30 -- accel/accel.sh@19 -- # IFS=: 00:06:08.955 10:24:30 -- accel/accel.sh@19 -- # read -r var val 00:06:08.955 10:24:30 -- accel/accel.sh@20 -- # val= 00:06:08.955 10:24:30 -- accel/accel.sh@21 -- # case "$var" in 00:06:08.955 10:24:30 -- accel/accel.sh@19 -- # IFS=: 00:06:08.955 10:24:30 -- accel/accel.sh@19 -- # read -r var val 00:06:08.955 10:24:30 -- accel/accel.sh@20 -- # val= 00:06:08.955 10:24:30 -- accel/accel.sh@21 -- # case "$var" in 00:06:08.955 10:24:30 -- accel/accel.sh@19 -- # IFS=: 00:06:08.955 10:24:30 -- accel/accel.sh@19 -- # read -r var val 00:06:08.955 10:24:30 -- accel/accel.sh@20 -- # val= 00:06:08.955 10:24:30 -- accel/accel.sh@21 -- # case "$var" in 00:06:08.955 10:24:30 -- accel/accel.sh@19 -- # IFS=: 00:06:08.955 10:24:30 -- accel/accel.sh@19 -- # read -r var val 00:06:08.955 10:24:30 -- accel/accel.sh@20 -- # val= 00:06:08.955 10:24:30 -- accel/accel.sh@21 -- # case "$var" in 00:06:08.955 10:24:30 -- accel/accel.sh@19 -- # IFS=: 00:06:08.955 10:24:30 -- accel/accel.sh@19 -- # read -r var val 00:06:08.955 10:24:30 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:08.955 10:24:30 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:06:08.955 10:24:30 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:08.955 00:06:08.955 real 0m1.350s 00:06:08.955 user 0m1.236s 00:06:08.955 sys 0m0.129s 00:06:08.955 10:24:30 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:08.955 10:24:30 -- common/autotest_common.sh@10 -- # set +x 00:06:08.955 ************************************ 00:06:08.955 END TEST accel_copy_crc32c 00:06:08.955 ************************************ 00:06:08.955 10:24:31 -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:08.955 
10:24:31 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:08.955 10:24:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:08.955 10:24:31 -- common/autotest_common.sh@10 -- # set +x 00:06:09.214 ************************************ 00:06:09.214 START TEST accel_copy_crc32c_C2 00:06:09.214 ************************************ 00:06:09.214 10:24:31 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:09.214 10:24:31 -- accel/accel.sh@16 -- # local accel_opc 00:06:09.214 10:24:31 -- accel/accel.sh@17 -- # local accel_module 00:06:09.214 10:24:31 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:09.214 10:24:31 -- accel/accel.sh@19 -- # IFS=: 00:06:09.214 10:24:31 -- accel/accel.sh@19 -- # read -r var val 00:06:09.214 10:24:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:09.214 10:24:31 -- accel/accel.sh@12 -- # build_accel_config 00:06:09.214 10:24:31 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:09.214 10:24:31 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:09.214 10:24:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:09.214 10:24:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:09.214 10:24:31 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:09.214 10:24:31 -- accel/accel.sh@40 -- # local IFS=, 00:06:09.214 10:24:31 -- accel/accel.sh@41 -- # jq -r . 00:06:09.214 [2024-04-19 10:24:31.159587] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:06:09.214 [2024-04-19 10:24:31.159647] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid196067 ] 00:06:09.214 EAL: No free 2048 kB hugepages reported on node 1 00:06:09.214 [2024-04-19 10:24:31.226124] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:09.214 [2024-04-19 10:24:31.301555] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.474 10:24:31 -- accel/accel.sh@20 -- # val= 00:06:09.474 10:24:31 -- accel/accel.sh@21 -- # case "$var" in 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # IFS=: 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # read -r var val 00:06:09.474 10:24:31 -- accel/accel.sh@20 -- # val= 00:06:09.474 10:24:31 -- accel/accel.sh@21 -- # case "$var" in 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # IFS=: 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # read -r var val 00:06:09.474 10:24:31 -- accel/accel.sh@20 -- # val=0x1 00:06:09.474 10:24:31 -- accel/accel.sh@21 -- # case "$var" in 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # IFS=: 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # read -r var val 00:06:09.474 10:24:31 -- accel/accel.sh@20 -- # val= 00:06:09.474 10:24:31 -- accel/accel.sh@21 -- # case "$var" in 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # IFS=: 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # read -r var val 00:06:09.474 10:24:31 -- accel/accel.sh@20 -- # val= 00:06:09.474 10:24:31 -- accel/accel.sh@21 -- # case "$var" in 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # IFS=: 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # read -r var val 00:06:09.474 10:24:31 -- accel/accel.sh@20 -- # val=copy_crc32c 00:06:09.474 10:24:31 -- accel/accel.sh@21 -- # case "$var" in 00:06:09.474 10:24:31 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # IFS=: 00:06:09.474 
10:24:31 -- accel/accel.sh@19 -- # read -r var val 00:06:09.474 10:24:31 -- accel/accel.sh@20 -- # val=0 00:06:09.474 10:24:31 -- accel/accel.sh@21 -- # case "$var" in 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # IFS=: 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # read -r var val 00:06:09.474 10:24:31 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:09.474 10:24:31 -- accel/accel.sh@21 -- # case "$var" in 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # IFS=: 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # read -r var val 00:06:09.474 10:24:31 -- accel/accel.sh@20 -- # val='8192 bytes' 00:06:09.474 10:24:31 -- accel/accel.sh@21 -- # case "$var" in 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # IFS=: 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # read -r var val 00:06:09.474 10:24:31 -- accel/accel.sh@20 -- # val= 00:06:09.474 10:24:31 -- accel/accel.sh@21 -- # case "$var" in 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # IFS=: 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # read -r var val 00:06:09.474 10:24:31 -- accel/accel.sh@20 -- # val=software 00:06:09.474 10:24:31 -- accel/accel.sh@21 -- # case "$var" in 00:06:09.474 10:24:31 -- accel/accel.sh@22 -- # accel_module=software 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # IFS=: 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # read -r var val 00:06:09.474 10:24:31 -- accel/accel.sh@20 -- # val=32 00:06:09.474 10:24:31 -- accel/accel.sh@21 -- # case "$var" in 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # IFS=: 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # read -r var val 00:06:09.474 10:24:31 -- accel/accel.sh@20 -- # val=32 00:06:09.474 10:24:31 -- accel/accel.sh@21 -- # case "$var" in 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # IFS=: 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # read -r var val 00:06:09.474 10:24:31 -- accel/accel.sh@20 -- # val=1 00:06:09.474 10:24:31 -- accel/accel.sh@21 -- # case "$var" in 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # IFS=: 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # read -r var val 00:06:09.474 10:24:31 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:09.474 10:24:31 -- accel/accel.sh@21 -- # case "$var" in 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # IFS=: 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # read -r var val 00:06:09.474 10:24:31 -- accel/accel.sh@20 -- # val=Yes 00:06:09.474 10:24:31 -- accel/accel.sh@21 -- # case "$var" in 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # IFS=: 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # read -r var val 00:06:09.474 10:24:31 -- accel/accel.sh@20 -- # val= 00:06:09.474 10:24:31 -- accel/accel.sh@21 -- # case "$var" in 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # IFS=: 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # read -r var val 00:06:09.474 10:24:31 -- accel/accel.sh@20 -- # val= 00:06:09.474 10:24:31 -- accel/accel.sh@21 -- # case "$var" in 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # IFS=: 00:06:09.474 10:24:31 -- accel/accel.sh@19 -- # read -r var val 00:06:10.411 10:24:32 -- accel/accel.sh@20 -- # val= 00:06:10.411 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.411 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.411 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.411 10:24:32 -- accel/accel.sh@20 -- # val= 00:06:10.411 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.411 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.411 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.411 10:24:32 -- accel/accel.sh@20 -- # val= 00:06:10.411 10:24:32 -- accel/accel.sh@21 -- # case 
"$var" in 00:06:10.411 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.411 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.411 10:24:32 -- accel/accel.sh@20 -- # val= 00:06:10.411 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.411 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.411 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.411 10:24:32 -- accel/accel.sh@20 -- # val= 00:06:10.411 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.411 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.411 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.411 10:24:32 -- accel/accel.sh@20 -- # val= 00:06:10.411 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.411 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.411 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.411 10:24:32 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:10.411 10:24:32 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:06:10.411 10:24:32 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:10.411 00:06:10.411 real 0m1.332s 00:06:10.411 user 0m1.217s 00:06:10.411 sys 0m0.129s 00:06:10.411 10:24:32 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:10.411 10:24:32 -- common/autotest_common.sh@10 -- # set +x 00:06:10.411 ************************************ 00:06:10.411 END TEST accel_copy_crc32c_C2 00:06:10.411 ************************************ 00:06:10.411 10:24:32 -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:10.411 10:24:32 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:10.411 10:24:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:10.411 10:24:32 -- common/autotest_common.sh@10 -- # set +x 00:06:10.670 ************************************ 00:06:10.670 START TEST accel_dualcast 00:06:10.670 ************************************ 00:06:10.670 10:24:32 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w dualcast -y 00:06:10.670 10:24:32 -- accel/accel.sh@16 -- # local accel_opc 00:06:10.670 10:24:32 -- accel/accel.sh@17 -- # local accel_module 00:06:10.670 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.670 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.670 10:24:32 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:10.671 10:24:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:10.671 10:24:32 -- accel/accel.sh@12 -- # build_accel_config 00:06:10.671 10:24:32 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:10.671 10:24:32 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:10.671 10:24:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:10.671 10:24:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:10.671 10:24:32 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:10.671 10:24:32 -- accel/accel.sh@40 -- # local IFS=, 00:06:10.671 10:24:32 -- accel/accel.sh@41 -- # jq -r . 00:06:10.671 [2024-04-19 10:24:32.643858] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
00:06:10.671 [2024-04-19 10:24:32.643948] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid196260 ] 00:06:10.671 EAL: No free 2048 kB hugepages reported on node 1 00:06:10.671 [2024-04-19 10:24:32.716128] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:10.930 [2024-04-19 10:24:32.793439] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:10.930 10:24:32 -- accel/accel.sh@20 -- # val= 00:06:10.930 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.930 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.930 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.930 10:24:32 -- accel/accel.sh@20 -- # val= 00:06:10.930 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.930 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.930 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.930 10:24:32 -- accel/accel.sh@20 -- # val=0x1 00:06:10.930 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.930 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.930 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.930 10:24:32 -- accel/accel.sh@20 -- # val= 00:06:10.930 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.930 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.930 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.930 10:24:32 -- accel/accel.sh@20 -- # val= 00:06:10.930 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.930 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.930 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.930 10:24:32 -- accel/accel.sh@20 -- # val=dualcast 00:06:10.930 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.930 10:24:32 -- accel/accel.sh@23 -- # accel_opc=dualcast 00:06:10.930 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.930 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.930 10:24:32 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:10.930 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.930 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.931 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.931 10:24:32 -- accel/accel.sh@20 -- # val= 00:06:10.931 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.931 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.931 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.931 10:24:32 -- accel/accel.sh@20 -- # val=software 00:06:10.931 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.931 10:24:32 -- accel/accel.sh@22 -- # accel_module=software 00:06:10.931 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.931 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.931 10:24:32 -- accel/accel.sh@20 -- # val=32 00:06:10.931 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.931 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.931 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.931 10:24:32 -- accel/accel.sh@20 -- # val=32 00:06:10.931 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.931 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.931 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.931 10:24:32 -- accel/accel.sh@20 -- # val=1 00:06:10.931 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.931 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.931 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.931 10:24:32 -- 
accel/accel.sh@20 -- # val='1 seconds' 00:06:10.931 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.931 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.931 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.931 10:24:32 -- accel/accel.sh@20 -- # val=Yes 00:06:10.931 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.931 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.931 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.931 10:24:32 -- accel/accel.sh@20 -- # val= 00:06:10.931 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.931 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.931 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:10.931 10:24:32 -- accel/accel.sh@20 -- # val= 00:06:10.931 10:24:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:10.931 10:24:32 -- accel/accel.sh@19 -- # IFS=: 00:06:10.931 10:24:32 -- accel/accel.sh@19 -- # read -r var val 00:06:11.869 10:24:33 -- accel/accel.sh@20 -- # val= 00:06:11.869 10:24:33 -- accel/accel.sh@21 -- # case "$var" in 00:06:11.869 10:24:33 -- accel/accel.sh@19 -- # IFS=: 00:06:11.869 10:24:33 -- accel/accel.sh@19 -- # read -r var val 00:06:11.869 10:24:33 -- accel/accel.sh@20 -- # val= 00:06:11.869 10:24:33 -- accel/accel.sh@21 -- # case "$var" in 00:06:11.869 10:24:33 -- accel/accel.sh@19 -- # IFS=: 00:06:11.869 10:24:33 -- accel/accel.sh@19 -- # read -r var val 00:06:11.869 10:24:33 -- accel/accel.sh@20 -- # val= 00:06:11.869 10:24:33 -- accel/accel.sh@21 -- # case "$var" in 00:06:11.869 10:24:33 -- accel/accel.sh@19 -- # IFS=: 00:06:11.869 10:24:33 -- accel/accel.sh@19 -- # read -r var val 00:06:11.869 10:24:33 -- accel/accel.sh@20 -- # val= 00:06:11.869 10:24:33 -- accel/accel.sh@21 -- # case "$var" in 00:06:11.869 10:24:33 -- accel/accel.sh@19 -- # IFS=: 00:06:11.869 10:24:33 -- accel/accel.sh@19 -- # read -r var val 00:06:11.869 10:24:33 -- accel/accel.sh@20 -- # val= 00:06:11.869 10:24:33 -- accel/accel.sh@21 -- # case "$var" in 00:06:11.869 10:24:33 -- accel/accel.sh@19 -- # IFS=: 00:06:11.869 10:24:33 -- accel/accel.sh@19 -- # read -r var val 00:06:11.869 10:24:33 -- accel/accel.sh@20 -- # val= 00:06:11.869 10:24:33 -- accel/accel.sh@21 -- # case "$var" in 00:06:11.869 10:24:33 -- accel/accel.sh@19 -- # IFS=: 00:06:11.869 10:24:33 -- accel/accel.sh@19 -- # read -r var val 00:06:11.869 10:24:33 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:11.869 10:24:33 -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:06:11.869 10:24:33 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:11.869 00:06:11.869 real 0m1.347s 00:06:11.869 user 0m1.230s 00:06:11.869 sys 0m0.130s 00:06:11.869 10:24:33 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:11.869 10:24:33 -- common/autotest_common.sh@10 -- # set +x 00:06:11.869 ************************************ 00:06:11.869 END TEST accel_dualcast 00:06:11.869 ************************************ 00:06:12.128 10:24:34 -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:12.128 10:24:34 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:12.128 10:24:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:12.128 10:24:34 -- common/autotest_common.sh@10 -- # set +x 00:06:12.128 ************************************ 00:06:12.128 START TEST accel_compare 00:06:12.128 ************************************ 00:06:12.128 10:24:34 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w compare -y 00:06:12.128 10:24:34 -- accel/accel.sh@16 -- # local accel_opc 00:06:12.128 10:24:34 -- 
accel/accel.sh@17 -- # local accel_module 00:06:12.128 10:24:34 -- accel/accel.sh@19 -- # IFS=: 00:06:12.128 10:24:34 -- accel/accel.sh@19 -- # read -r var val 00:06:12.128 10:24:34 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:12.128 10:24:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:12.128 10:24:34 -- accel/accel.sh@12 -- # build_accel_config 00:06:12.128 10:24:34 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:12.128 10:24:34 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:12.128 10:24:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:12.128 10:24:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:12.128 10:24:34 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:12.128 10:24:34 -- accel/accel.sh@40 -- # local IFS=, 00:06:12.128 10:24:34 -- accel/accel.sh@41 -- # jq -r . 00:06:12.128 [2024-04-19 10:24:34.151397] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:06:12.128 [2024-04-19 10:24:34.151481] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid196457 ] 00:06:12.128 EAL: No free 2048 kB hugepages reported on node 1 00:06:12.128 [2024-04-19 10:24:34.223778] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:12.388 [2024-04-19 10:24:34.311820] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:12.388 10:24:34 -- accel/accel.sh@20 -- # val= 00:06:12.388 10:24:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # IFS=: 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # read -r var val 00:06:12.388 10:24:34 -- accel/accel.sh@20 -- # val= 00:06:12.388 10:24:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # IFS=: 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # read -r var val 00:06:12.388 10:24:34 -- accel/accel.sh@20 -- # val=0x1 00:06:12.388 10:24:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # IFS=: 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # read -r var val 00:06:12.388 10:24:34 -- accel/accel.sh@20 -- # val= 00:06:12.388 10:24:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # IFS=: 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # read -r var val 00:06:12.388 10:24:34 -- accel/accel.sh@20 -- # val= 00:06:12.388 10:24:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # IFS=: 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # read -r var val 00:06:12.388 10:24:34 -- accel/accel.sh@20 -- # val=compare 00:06:12.388 10:24:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:12.388 10:24:34 -- accel/accel.sh@23 -- # accel_opc=compare 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # IFS=: 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # read -r var val 00:06:12.388 10:24:34 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:12.388 10:24:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # IFS=: 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # read -r var val 00:06:12.388 10:24:34 -- accel/accel.sh@20 -- # val= 00:06:12.388 10:24:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # IFS=: 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # read -r var val 00:06:12.388 10:24:34 -- 
accel/accel.sh@20 -- # val=software 00:06:12.388 10:24:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:12.388 10:24:34 -- accel/accel.sh@22 -- # accel_module=software 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # IFS=: 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # read -r var val 00:06:12.388 10:24:34 -- accel/accel.sh@20 -- # val=32 00:06:12.388 10:24:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # IFS=: 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # read -r var val 00:06:12.388 10:24:34 -- accel/accel.sh@20 -- # val=32 00:06:12.388 10:24:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # IFS=: 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # read -r var val 00:06:12.388 10:24:34 -- accel/accel.sh@20 -- # val=1 00:06:12.388 10:24:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # IFS=: 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # read -r var val 00:06:12.388 10:24:34 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:12.388 10:24:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # IFS=: 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # read -r var val 00:06:12.388 10:24:34 -- accel/accel.sh@20 -- # val=Yes 00:06:12.388 10:24:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # IFS=: 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # read -r var val 00:06:12.388 10:24:34 -- accel/accel.sh@20 -- # val= 00:06:12.388 10:24:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # IFS=: 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # read -r var val 00:06:12.388 10:24:34 -- accel/accel.sh@20 -- # val= 00:06:12.388 10:24:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # IFS=: 00:06:12.388 10:24:34 -- accel/accel.sh@19 -- # read -r var val 00:06:13.770 10:24:35 -- accel/accel.sh@20 -- # val= 00:06:13.770 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:13.770 10:24:35 -- accel/accel.sh@20 -- # val= 00:06:13.770 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:13.770 10:24:35 -- accel/accel.sh@20 -- # val= 00:06:13.770 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:13.770 10:24:35 -- accel/accel.sh@20 -- # val= 00:06:13.770 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:13.770 10:24:35 -- accel/accel.sh@20 -- # val= 00:06:13.770 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:13.770 10:24:35 -- accel/accel.sh@20 -- # val= 00:06:13.770 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:13.770 10:24:35 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:13.770 10:24:35 -- accel/accel.sh@27 -- # [[ -n compare ]] 00:06:13.770 10:24:35 -- accel/accel.sh@27 -- # [[ software == 
\s\o\f\t\w\a\r\e ]] 00:06:13.770 00:06:13.770 real 0m1.359s 00:06:13.770 user 0m1.235s 00:06:13.770 sys 0m0.138s 00:06:13.770 10:24:35 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:13.770 10:24:35 -- common/autotest_common.sh@10 -- # set +x 00:06:13.770 ************************************ 00:06:13.770 END TEST accel_compare 00:06:13.770 ************************************ 00:06:13.770 10:24:35 -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:13.770 10:24:35 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:13.770 10:24:35 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:13.770 10:24:35 -- common/autotest_common.sh@10 -- # set +x 00:06:13.770 ************************************ 00:06:13.770 START TEST accel_xor 00:06:13.770 ************************************ 00:06:13.770 10:24:35 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w xor -y 00:06:13.770 10:24:35 -- accel/accel.sh@16 -- # local accel_opc 00:06:13.770 10:24:35 -- accel/accel.sh@17 -- # local accel_module 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:13.770 10:24:35 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:13.770 10:24:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:13.770 10:24:35 -- accel/accel.sh@12 -- # build_accel_config 00:06:13.770 10:24:35 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:13.770 10:24:35 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:13.770 10:24:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:13.770 10:24:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:13.770 10:24:35 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:13.770 10:24:35 -- accel/accel.sh@40 -- # local IFS=, 00:06:13.770 10:24:35 -- accel/accel.sh@41 -- # jq -r . 00:06:13.770 [2024-04-19 10:24:35.673323] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
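The two xor runs differ only in source count: the first run below traces `val=xor` followed by `val=2` (apparently the default of two sources), while the follow-up run at accel.sh@110 adds `-x 3`, which appears as `val=3`. The launches as traced at accel.sh@12, with the workspace path shortened:

# -c /dev/fd/62 feeds an accel JSON config over fd 62 (empty in these runs,
# per accel_json_cfg=() above); a standalone reproduction can drop it.
build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y        # two XOR sources
build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3   # three XOR sources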
00:06:13.770 [2024-04-19 10:24:35.673411] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid196656 ] 00:06:13.770 EAL: No free 2048 kB hugepages reported on node 1 00:06:13.770 [2024-04-19 10:24:35.746490] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:13.770 [2024-04-19 10:24:35.829627] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.770 10:24:35 -- accel/accel.sh@20 -- # val= 00:06:13.770 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:13.770 10:24:35 -- accel/accel.sh@20 -- # val= 00:06:13.770 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:13.770 10:24:35 -- accel/accel.sh@20 -- # val=0x1 00:06:13.770 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:13.770 10:24:35 -- accel/accel.sh@20 -- # val= 00:06:13.770 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:13.770 10:24:35 -- accel/accel.sh@20 -- # val= 00:06:13.770 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:13.770 10:24:35 -- accel/accel.sh@20 -- # val=xor 00:06:13.770 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:13.770 10:24:35 -- accel/accel.sh@23 -- # accel_opc=xor 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:13.770 10:24:35 -- accel/accel.sh@20 -- # val=2 00:06:13.770 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:13.770 10:24:35 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:13.770 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:13.770 10:24:35 -- accel/accel.sh@20 -- # val= 00:06:13.770 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:13.770 10:24:35 -- accel/accel.sh@20 -- # val=software 00:06:13.770 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:13.770 10:24:35 -- accel/accel.sh@22 -- # accel_module=software 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:13.770 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:14.029 10:24:35 -- accel/accel.sh@20 -- # val=32 00:06:14.029 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:14.029 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:14.029 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:14.029 10:24:35 -- accel/accel.sh@20 -- # val=32 00:06:14.029 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:14.029 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:14.029 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:14.029 10:24:35 -- 
accel/accel.sh@20 -- # val=1 00:06:14.029 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:14.029 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:14.029 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:14.029 10:24:35 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:14.029 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:14.029 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:14.030 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:14.030 10:24:35 -- accel/accel.sh@20 -- # val=Yes 00:06:14.030 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:14.030 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:14.030 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:14.030 10:24:35 -- accel/accel.sh@20 -- # val= 00:06:14.030 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:14.030 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:14.030 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:14.030 10:24:35 -- accel/accel.sh@20 -- # val= 00:06:14.030 10:24:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:14.030 10:24:35 -- accel/accel.sh@19 -- # IFS=: 00:06:14.030 10:24:35 -- accel/accel.sh@19 -- # read -r var val 00:06:14.967 10:24:36 -- accel/accel.sh@20 -- # val= 00:06:14.967 10:24:36 -- accel/accel.sh@21 -- # case "$var" in 00:06:14.967 10:24:36 -- accel/accel.sh@19 -- # IFS=: 00:06:14.967 10:24:36 -- accel/accel.sh@19 -- # read -r var val 00:06:14.967 10:24:36 -- accel/accel.sh@20 -- # val= 00:06:14.967 10:24:36 -- accel/accel.sh@21 -- # case "$var" in 00:06:14.967 10:24:36 -- accel/accel.sh@19 -- # IFS=: 00:06:14.967 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:14.967 10:24:37 -- accel/accel.sh@20 -- # val= 00:06:14.967 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:14.967 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:14.967 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:14.967 10:24:37 -- accel/accel.sh@20 -- # val= 00:06:14.967 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:14.967 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:14.967 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:14.967 10:24:37 -- accel/accel.sh@20 -- # val= 00:06:14.967 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:14.967 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:14.967 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:14.967 10:24:37 -- accel/accel.sh@20 -- # val= 00:06:14.967 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:14.967 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:14.967 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:14.967 10:24:37 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:14.967 10:24:37 -- accel/accel.sh@27 -- # [[ -n xor ]] 00:06:14.967 10:24:37 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:14.967 00:06:14.967 real 0m1.354s 00:06:14.967 user 0m1.232s 00:06:14.967 sys 0m0.138s 00:06:14.967 10:24:37 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:14.967 10:24:37 -- common/autotest_common.sh@10 -- # set +x 00:06:14.967 ************************************ 00:06:14.967 END TEST accel_xor 00:06:14.967 ************************************ 00:06:14.967 10:24:37 -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:06:14.967 10:24:37 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:14.967 10:24:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:14.967 10:24:37 -- common/autotest_common.sh@10 -- # set +x 00:06:15.227 ************************************ 00:06:15.227 START TEST accel_xor 
00:06:15.227 ************************************ 00:06:15.227 10:24:37 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w xor -y -x 3 00:06:15.227 10:24:37 -- accel/accel.sh@16 -- # local accel_opc 00:06:15.227 10:24:37 -- accel/accel.sh@17 -- # local accel_module 00:06:15.227 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:15.227 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:15.227 10:24:37 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:06:15.227 10:24:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:15.227 10:24:37 -- accel/accel.sh@12 -- # build_accel_config 00:06:15.227 10:24:37 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:15.227 10:24:37 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:15.227 10:24:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:15.227 10:24:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:15.227 10:24:37 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:15.227 10:24:37 -- accel/accel.sh@40 -- # local IFS=, 00:06:15.227 10:24:37 -- accel/accel.sh@41 -- # jq -r . 00:06:15.227 [2024-04-19 10:24:37.192986] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:06:15.227 [2024-04-19 10:24:37.193072] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid196856 ] 00:06:15.227 EAL: No free 2048 kB hugepages reported on node 1 00:06:15.227 [2024-04-19 10:24:37.266257] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:15.487 [2024-04-19 10:24:37.345846] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.487 10:24:37 -- accel/accel.sh@20 -- # val= 00:06:15.487 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:15.487 10:24:37 -- accel/accel.sh@20 -- # val= 00:06:15.487 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:15.487 10:24:37 -- accel/accel.sh@20 -- # val=0x1 00:06:15.487 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:15.487 10:24:37 -- accel/accel.sh@20 -- # val= 00:06:15.487 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:15.487 10:24:37 -- accel/accel.sh@20 -- # val= 00:06:15.487 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:15.487 10:24:37 -- accel/accel.sh@20 -- # val=xor 00:06:15.487 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:15.487 10:24:37 -- accel/accel.sh@23 -- # accel_opc=xor 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:15.487 10:24:37 -- accel/accel.sh@20 -- # val=3 00:06:15.487 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:15.487 10:24:37 -- accel/accel.sh@20 -- # 
val='4096 bytes' 00:06:15.487 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:15.487 10:24:37 -- accel/accel.sh@20 -- # val= 00:06:15.487 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:15.487 10:24:37 -- accel/accel.sh@20 -- # val=software 00:06:15.487 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:15.487 10:24:37 -- accel/accel.sh@22 -- # accel_module=software 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:15.487 10:24:37 -- accel/accel.sh@20 -- # val=32 00:06:15.487 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:15.487 10:24:37 -- accel/accel.sh@20 -- # val=32 00:06:15.487 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:15.487 10:24:37 -- accel/accel.sh@20 -- # val=1 00:06:15.487 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:15.487 10:24:37 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:15.487 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:15.487 10:24:37 -- accel/accel.sh@20 -- # val=Yes 00:06:15.487 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:15.487 10:24:37 -- accel/accel.sh@20 -- # val= 00:06:15.487 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:15.487 10:24:37 -- accel/accel.sh@20 -- # val= 00:06:15.487 10:24:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # IFS=: 00:06:15.487 10:24:37 -- accel/accel.sh@19 -- # read -r var val 00:06:16.426 10:24:38 -- accel/accel.sh@20 -- # val= 00:06:16.426 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.426 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.426 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.426 10:24:38 -- accel/accel.sh@20 -- # val= 00:06:16.426 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.426 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.426 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.426 10:24:38 -- accel/accel.sh@20 -- # val= 00:06:16.426 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.426 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.426 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.426 10:24:38 -- accel/accel.sh@20 -- # val= 00:06:16.426 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.426 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.426 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.426 10:24:38 -- accel/accel.sh@20 -- # val= 00:06:16.426 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.426 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.426 10:24:38 -- accel/accel.sh@19 -- # 
read -r var val 00:06:16.426 10:24:38 -- accel/accel.sh@20 -- # val= 00:06:16.426 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.427 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.427 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.427 10:24:38 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:16.427 10:24:38 -- accel/accel.sh@27 -- # [[ -n xor ]] 00:06:16.427 10:24:38 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:16.427 00:06:16.427 real 0m1.351s 00:06:16.427 user 0m1.226s 00:06:16.427 sys 0m0.139s 00:06:16.427 10:24:38 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:16.427 10:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:16.427 ************************************ 00:06:16.427 END TEST accel_xor 00:06:16.427 ************************************ 00:06:16.686 10:24:38 -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:06:16.686 10:24:38 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:16.686 10:24:38 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:16.686 10:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:16.686 ************************************ 00:06:16.686 START TEST accel_dif_verify 00:06:16.687 ************************************ 00:06:16.687 10:24:38 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w dif_verify 00:06:16.687 10:24:38 -- accel/accel.sh@16 -- # local accel_opc 00:06:16.687 10:24:38 -- accel/accel.sh@17 -- # local accel_module 00:06:16.687 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.687 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.687 10:24:38 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:06:16.687 10:24:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:16.687 10:24:38 -- accel/accel.sh@12 -- # build_accel_config 00:06:16.687 10:24:38 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:16.687 10:24:38 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:16.687 10:24:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:16.687 10:24:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:16.687 10:24:38 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:16.687 10:24:38 -- accel/accel.sh@40 -- # local IFS=, 00:06:16.687 10:24:38 -- accel/accel.sh@41 -- # jq -r . 00:06:16.687 [2024-04-19 10:24:38.708478] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
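[editor's note] The harness pattern repeated throughout this section: run_test wraps each workload, and accel_test launches build/examples/accel_perf with a JSON config supplied on fd 62 ("-c /dev/fd/62" in the trace). A minimal stand-alone sketch of that invocation, using only the flags visible in this log (-c, -t, -w); the empty '{}' config is an assumption, since the trace only shows that no module overrides were configured:

    # Hypothetical re-run of one workload outside the harness.
    SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk

    run_accel() {
        local workload=$1
        # "-c /dev/fd/62" matches the trace; the herestring supplies JSON on fd 62.
        "$SPDK/build/examples/accel_perf" -c /dev/fd/62 -t 1 -w "$workload" 62<<< '{}'
    }

    run_accel dif_verify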
00:06:16.687 [2024-04-19 10:24:38.708561] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid197049 ] 00:06:16.687 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.687 [2024-04-19 10:24:38.781383] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.946 [2024-04-19 10:24:38.864183] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.946 10:24:38 -- accel/accel.sh@20 -- # val= 00:06:16.946 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.946 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.946 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.946 10:24:38 -- accel/accel.sh@20 -- # val= 00:06:16.946 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.946 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.946 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.946 10:24:38 -- accel/accel.sh@20 -- # val=0x1 00:06:16.946 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.946 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.946 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.946 10:24:38 -- accel/accel.sh@20 -- # val= 00:06:16.946 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.946 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.946 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.946 10:24:38 -- accel/accel.sh@20 -- # val= 00:06:16.946 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.946 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.946 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.946 10:24:38 -- accel/accel.sh@20 -- # val=dif_verify 00:06:16.946 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.946 10:24:38 -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:06:16.946 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.946 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.946 10:24:38 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:16.946 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.946 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.946 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.946 10:24:38 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:16.947 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.947 10:24:38 -- accel/accel.sh@20 -- # val='512 bytes' 00:06:16.947 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.947 10:24:38 -- accel/accel.sh@20 -- # val='8 bytes' 00:06:16.947 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.947 10:24:38 -- accel/accel.sh@20 -- # val= 00:06:16.947 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.947 10:24:38 -- accel/accel.sh@20 -- # val=software 00:06:16.947 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.947 10:24:38 -- accel/accel.sh@22 -- # accel_module=software 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # read -r 
var val 00:06:16.947 10:24:38 -- accel/accel.sh@20 -- # val=32 00:06:16.947 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.947 10:24:38 -- accel/accel.sh@20 -- # val=32 00:06:16.947 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.947 10:24:38 -- accel/accel.sh@20 -- # val=1 00:06:16.947 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.947 10:24:38 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:16.947 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.947 10:24:38 -- accel/accel.sh@20 -- # val=No 00:06:16.947 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.947 10:24:38 -- accel/accel.sh@20 -- # val= 00:06:16.947 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:16.947 10:24:38 -- accel/accel.sh@20 -- # val= 00:06:16.947 10:24:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # IFS=: 00:06:16.947 10:24:38 -- accel/accel.sh@19 -- # read -r var val 00:06:18.327 10:24:40 -- accel/accel.sh@20 -- # val= 00:06:18.327 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.327 10:24:40 -- accel/accel.sh@20 -- # val= 00:06:18.327 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.327 10:24:40 -- accel/accel.sh@20 -- # val= 00:06:18.327 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.327 10:24:40 -- accel/accel.sh@20 -- # val= 00:06:18.327 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.327 10:24:40 -- accel/accel.sh@20 -- # val= 00:06:18.327 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.327 10:24:40 -- accel/accel.sh@20 -- # val= 00:06:18.327 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.327 10:24:40 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:18.327 10:24:40 -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:06:18.327 10:24:40 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:18.327 00:06:18.327 real 0m1.356s 00:06:18.327 user 0m1.240s 00:06:18.327 sys 0m0.131s 00:06:18.327 10:24:40 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:18.327 10:24:40 -- common/autotest_common.sh@10 -- # set +x 00:06:18.327 
************************************ 00:06:18.327 END TEST accel_dif_verify 00:06:18.327 ************************************ 00:06:18.327 10:24:40 -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:06:18.327 10:24:40 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:18.327 10:24:40 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:18.327 10:24:40 -- common/autotest_common.sh@10 -- # set +x 00:06:18.327 ************************************ 00:06:18.327 START TEST accel_dif_generate 00:06:18.327 ************************************ 00:06:18.327 10:24:40 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w dif_generate 00:06:18.327 10:24:40 -- accel/accel.sh@16 -- # local accel_opc 00:06:18.327 10:24:40 -- accel/accel.sh@17 -- # local accel_module 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.327 10:24:40 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:06:18.327 10:24:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:18.327 10:24:40 -- accel/accel.sh@12 -- # build_accel_config 00:06:18.327 10:24:40 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:18.327 10:24:40 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:18.327 10:24:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:18.327 10:24:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:18.327 10:24:40 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:18.327 10:24:40 -- accel/accel.sh@40 -- # local IFS=, 00:06:18.327 10:24:40 -- accel/accel.sh@41 -- # jq -r . 00:06:18.327 [2024-04-19 10:24:40.221227] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
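[editor's note] Most of the bulk in each TEST block above is a single loop: accel_perf reports its configuration as colon-separated key:value lines, and the script splits each with "IFS=: read -r var val" and dispatches on case "$var". A sketch of the idiom; the key names ("module", "opc") are illustrative assumptions, since the trace shows only the resulting assignments (accel_module=software, accel_opc=dif_generate):

    # Sketch of the key:value parsing loop behind the repeated
    # "IFS=: / read -r var val / case" lines in the trace above.
    parse_config() {
        local var val accel_module= accel_opc=
        while IFS=: read -r var val; do
            case "$var" in
                module) accel_module=$val ;;   # trace: accel_module=software
                opc)    accel_opc=$val ;;      # trace: accel_opc=dif_generate
            esac
        done
        printf 'module=%s opc=%s\n' "$accel_module" "$accel_opc"
    }

    parse_config <<'EOF'
    module:software
    opc:dif_generate
    EOF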
00:06:18.327 [2024-04-19 10:24:40.221314] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid197280 ] 00:06:18.327 EAL: No free 2048 kB hugepages reported on node 1 00:06:18.327 [2024-04-19 10:24:40.294182] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:18.327 [2024-04-19 10:24:40.372333] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:18.327 10:24:40 -- accel/accel.sh@20 -- # val= 00:06:18.327 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.327 10:24:40 -- accel/accel.sh@20 -- # val= 00:06:18.327 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.327 10:24:40 -- accel/accel.sh@20 -- # val=0x1 00:06:18.327 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.327 10:24:40 -- accel/accel.sh@20 -- # val= 00:06:18.327 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.327 10:24:40 -- accel/accel.sh@20 -- # val= 00:06:18.327 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.327 10:24:40 -- accel/accel.sh@20 -- # val=dif_generate 00:06:18.327 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.327 10:24:40 -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.327 10:24:40 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:18.327 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.327 10:24:40 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:18.327 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.327 10:24:40 -- accel/accel.sh@20 -- # val='512 bytes' 00:06:18.327 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.327 10:24:40 -- accel/accel.sh@20 -- # val='8 bytes' 00:06:18.327 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.327 10:24:40 -- accel/accel.sh@20 -- # val= 00:06:18.327 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.327 10:24:40 -- accel/accel.sh@20 -- # val=software 00:06:18.327 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.327 10:24:40 -- accel/accel.sh@22 -- # accel_module=software 00:06:18.327 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.328 10:24:40 -- accel/accel.sh@19 -- # read 
-r var val 00:06:18.328 10:24:40 -- accel/accel.sh@20 -- # val=32 00:06:18.328 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.328 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.328 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.328 10:24:40 -- accel/accel.sh@20 -- # val=32 00:06:18.328 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.328 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.328 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.328 10:24:40 -- accel/accel.sh@20 -- # val=1 00:06:18.328 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.328 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.328 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.328 10:24:40 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:18.328 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.328 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.328 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.328 10:24:40 -- accel/accel.sh@20 -- # val=No 00:06:18.328 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.328 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.328 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.328 10:24:40 -- accel/accel.sh@20 -- # val= 00:06:18.328 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.328 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.328 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:18.328 10:24:40 -- accel/accel.sh@20 -- # val= 00:06:18.328 10:24:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:18.328 10:24:40 -- accel/accel.sh@19 -- # IFS=: 00:06:18.328 10:24:40 -- accel/accel.sh@19 -- # read -r var val 00:06:19.707 10:24:41 -- accel/accel.sh@20 -- # val= 00:06:19.707 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.707 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.707 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.707 10:24:41 -- accel/accel.sh@20 -- # val= 00:06:19.707 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.707 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.707 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.707 10:24:41 -- accel/accel.sh@20 -- # val= 00:06:19.707 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.707 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.707 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.707 10:24:41 -- accel/accel.sh@20 -- # val= 00:06:19.707 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.707 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.707 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.707 10:24:41 -- accel/accel.sh@20 -- # val= 00:06:19.707 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.707 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.707 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.707 10:24:41 -- accel/accel.sh@20 -- # val= 00:06:19.707 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.707 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.707 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.707 10:24:41 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:19.707 10:24:41 -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:06:19.707 10:24:41 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:19.707 00:06:19.707 real 0m1.347s 00:06:19.707 user 0m1.230s 00:06:19.707 sys 0m0.133s 00:06:19.707 10:24:41 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:19.707 10:24:41 -- common/autotest_common.sh@10 -- # set +x 00:06:19.707 
************************************ 00:06:19.707 END TEST accel_dif_generate 00:06:19.707 ************************************ 00:06:19.707 10:24:41 -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:06:19.707 10:24:41 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:19.707 10:24:41 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:19.707 10:24:41 -- common/autotest_common.sh@10 -- # set +x 00:06:19.707 ************************************ 00:06:19.707 START TEST accel_dif_generate_copy 00:06:19.707 ************************************ 00:06:19.707 10:24:41 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w dif_generate_copy 00:06:19.707 10:24:41 -- accel/accel.sh@16 -- # local accel_opc 00:06:19.707 10:24:41 -- accel/accel.sh@17 -- # local accel_module 00:06:19.707 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.707 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.707 10:24:41 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:06:19.707 10:24:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:19.707 10:24:41 -- accel/accel.sh@12 -- # build_accel_config 00:06:19.707 10:24:41 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:19.707 10:24:41 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:19.707 10:24:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:19.707 10:24:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:19.708 10:24:41 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:19.708 10:24:41 -- accel/accel.sh@40 -- # local IFS=, 00:06:19.708 10:24:41 -- accel/accel.sh@41 -- # jq -r . 00:06:19.708 [2024-04-19 10:24:41.717216] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
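[editor's note] Every TEST block ends with the same three checks, visible above as the accel.sh@27 lines: a module was reported, the expected opcode was seen, and the module equals "software" (the backslash-escaped \s\o\f\t\w\a\r\e in the trace is how xtrace renders a quoted, literal == pattern). Reconstructed as a stand-alone sketch:

    # The three assertions each accel test above ends with.
    accel_module=software
    accel_opc=dif_generate_copy

    [[ -n "$accel_module" ]]          || { echo FAIL; exit 1; }  # a module was set
    [[ -n "$accel_opc" ]]             || { echo FAIL; exit 1; }  # an opcode was set
    [[ "$accel_module" == software ]] || { echo FAIL; exit 1; }  # literal match
    echo PASS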
00:06:19.708 [2024-04-19 10:24:41.717305] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid197551 ] 00:06:19.708 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.708 [2024-04-19 10:24:41.788273] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.967 [2024-04-19 10:24:41.869433] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.967 10:24:41 -- accel/accel.sh@20 -- # val= 00:06:19.967 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.967 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.967 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.967 10:24:41 -- accel/accel.sh@20 -- # val= 00:06:19.967 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.967 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.967 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.967 10:24:41 -- accel/accel.sh@20 -- # val=0x1 00:06:19.967 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.967 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.967 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.967 10:24:41 -- accel/accel.sh@20 -- # val= 00:06:19.967 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.967 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.967 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.967 10:24:41 -- accel/accel.sh@20 -- # val= 00:06:19.967 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.967 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.967 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.967 10:24:41 -- accel/accel.sh@20 -- # val=dif_generate_copy 00:06:19.967 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.967 10:24:41 -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:06:19.967 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.967 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.967 10:24:41 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:19.967 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.967 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.967 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.967 10:24:41 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:19.967 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.967 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.967 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.967 10:24:41 -- accel/accel.sh@20 -- # val= 00:06:19.967 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.967 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.967 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.967 10:24:41 -- accel/accel.sh@20 -- # val=software 00:06:19.967 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.968 10:24:41 -- accel/accel.sh@22 -- # accel_module=software 00:06:19.968 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.968 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.968 10:24:41 -- accel/accel.sh@20 -- # val=32 00:06:19.968 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.968 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.968 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.968 10:24:41 -- accel/accel.sh@20 -- # val=32 00:06:19.968 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.968 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.968 10:24:41 -- accel/accel.sh@19 -- # read -r var 
val 00:06:19.968 10:24:41 -- accel/accel.sh@20 -- # val=1 00:06:19.968 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.968 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.968 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.968 10:24:41 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:19.968 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.968 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.968 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.968 10:24:41 -- accel/accel.sh@20 -- # val=No 00:06:19.968 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.968 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.968 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.968 10:24:41 -- accel/accel.sh@20 -- # val= 00:06:19.968 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.968 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.968 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:19.968 10:24:41 -- accel/accel.sh@20 -- # val= 00:06:19.968 10:24:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:19.968 10:24:41 -- accel/accel.sh@19 -- # IFS=: 00:06:19.968 10:24:41 -- accel/accel.sh@19 -- # read -r var val 00:06:21.347 10:24:43 -- accel/accel.sh@20 -- # val= 00:06:21.347 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.347 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.347 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.347 10:24:43 -- accel/accel.sh@20 -- # val= 00:06:21.347 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.347 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.347 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.347 10:24:43 -- accel/accel.sh@20 -- # val= 00:06:21.347 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.347 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.347 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.347 10:24:43 -- accel/accel.sh@20 -- # val= 00:06:21.347 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.347 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.347 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.347 10:24:43 -- accel/accel.sh@20 -- # val= 00:06:21.347 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.347 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.347 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.347 10:24:43 -- accel/accel.sh@20 -- # val= 00:06:21.347 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.347 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.347 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.347 10:24:43 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:21.347 10:24:43 -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:06:21.347 10:24:43 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:21.347 00:06:21.347 real 0m1.350s 00:06:21.347 user 0m1.228s 00:06:21.347 sys 0m0.135s 00:06:21.347 10:24:43 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:21.347 10:24:43 -- common/autotest_common.sh@10 -- # set +x 00:06:21.347 ************************************ 00:06:21.347 END TEST accel_dif_generate_copy 00:06:21.347 ************************************ 00:06:21.347 10:24:43 -- accel/accel.sh@115 -- # [[ y == y ]] 00:06:21.347 10:24:43 -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:21.347 10:24:43 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:21.347 10:24:43 -- 
common/autotest_common.sh@1093 -- # xtrace_disable 00:06:21.347 10:24:43 -- common/autotest_common.sh@10 -- # set +x 00:06:21.347 ************************************ 00:06:21.347 START TEST accel_comp 00:06:21.347 ************************************ 00:06:21.347 10:24:43 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:21.347 10:24:43 -- accel/accel.sh@16 -- # local accel_opc 00:06:21.347 10:24:43 -- accel/accel.sh@17 -- # local accel_module 00:06:21.347 10:24:43 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:21.347 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.347 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.347 10:24:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:21.347 10:24:43 -- accel/accel.sh@12 -- # build_accel_config 00:06:21.347 10:24:43 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:21.347 10:24:43 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:21.347 10:24:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:21.347 10:24:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:21.347 10:24:43 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:21.347 10:24:43 -- accel/accel.sh@40 -- # local IFS=, 00:06:21.347 10:24:43 -- accel/accel.sh@41 -- # jq -r . 00:06:21.347 [2024-04-19 10:24:43.189691] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:06:21.347 [2024-04-19 10:24:43.189751] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid197805 ] 00:06:21.347 EAL: No free 2048 kB hugepages reported on node 1 00:06:21.347 [2024-04-19 10:24:43.256662] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:21.347 [2024-04-19 10:24:43.332367] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:21.348 10:24:43 -- accel/accel.sh@20 -- # val= 00:06:21.348 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.348 10:24:43 -- accel/accel.sh@20 -- # val= 00:06:21.348 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.348 10:24:43 -- accel/accel.sh@20 -- # val= 00:06:21.348 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.348 10:24:43 -- accel/accel.sh@20 -- # val=0x1 00:06:21.348 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.348 10:24:43 -- accel/accel.sh@20 -- # val= 00:06:21.348 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.348 10:24:43 -- accel/accel.sh@20 -- # val= 00:06:21.348 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.348 
10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.348 10:24:43 -- accel/accel.sh@20 -- # val=compress 00:06:21.348 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.348 10:24:43 -- accel/accel.sh@23 -- # accel_opc=compress 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.348 10:24:43 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:21.348 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.348 10:24:43 -- accel/accel.sh@20 -- # val= 00:06:21.348 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.348 10:24:43 -- accel/accel.sh@20 -- # val=software 00:06:21.348 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.348 10:24:43 -- accel/accel.sh@22 -- # accel_module=software 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.348 10:24:43 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:21.348 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.348 10:24:43 -- accel/accel.sh@20 -- # val=32 00:06:21.348 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.348 10:24:43 -- accel/accel.sh@20 -- # val=32 00:06:21.348 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.348 10:24:43 -- accel/accel.sh@20 -- # val=1 00:06:21.348 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.348 10:24:43 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:21.348 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.348 10:24:43 -- accel/accel.sh@20 -- # val=No 00:06:21.348 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.348 10:24:43 -- accel/accel.sh@20 -- # val= 00:06:21.348 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:21.348 10:24:43 -- accel/accel.sh@20 -- # val= 00:06:21.348 10:24:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # IFS=: 00:06:21.348 10:24:43 -- accel/accel.sh@19 -- # read -r var val 00:06:22.728 10:24:44 -- accel/accel.sh@20 -- # val= 00:06:22.728 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.728 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.728 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.728 10:24:44 -- accel/accel.sh@20 -- # val= 00:06:22.728 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.728 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.728 10:24:44 -- accel/accel.sh@19 
-- # read -r var val 00:06:22.728 10:24:44 -- accel/accel.sh@20 -- # val= 00:06:22.728 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.728 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.728 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.728 10:24:44 -- accel/accel.sh@20 -- # val= 00:06:22.728 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.728 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.728 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.728 10:24:44 -- accel/accel.sh@20 -- # val= 00:06:22.728 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.728 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.728 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.728 10:24:44 -- accel/accel.sh@20 -- # val= 00:06:22.728 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.728 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.728 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.728 10:24:44 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:22.728 10:24:44 -- accel/accel.sh@27 -- # [[ -n compress ]] 00:06:22.728 10:24:44 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:22.728 00:06:22.728 real 0m1.334s 00:06:22.728 user 0m1.220s 00:06:22.728 sys 0m0.129s 00:06:22.728 10:24:44 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:22.728 10:24:44 -- common/autotest_common.sh@10 -- # set +x 00:06:22.728 ************************************ 00:06:22.728 END TEST accel_comp 00:06:22.728 ************************************ 00:06:22.728 10:24:44 -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:22.728 10:24:44 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:22.728 10:24:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:22.728 10:24:44 -- common/autotest_common.sh@10 -- # set +x 00:06:22.728 ************************************ 00:06:22.728 START TEST accel_decomp 00:06:22.728 ************************************ 00:06:22.728 10:24:44 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:22.728 10:24:44 -- accel/accel.sh@16 -- # local accel_opc 00:06:22.728 10:24:44 -- accel/accel.sh@17 -- # local accel_module 00:06:22.728 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.728 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.728 10:24:44 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:22.728 10:24:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:22.728 10:24:44 -- accel/accel.sh@12 -- # build_accel_config 00:06:22.728 10:24:44 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:22.728 10:24:44 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:22.728 10:24:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:22.728 10:24:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:22.728 10:24:44 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:22.728 10:24:44 -- accel/accel.sh@40 -- # local IFS=, 00:06:22.728 10:24:44 -- accel/accel.sh@41 -- # jq -r . 00:06:22.728 [2024-04-19 10:24:44.680686] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
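[editor's note] The compression tests introduce two flags beyond the earlier workloads: -l points accel_perf at an input file (test/accel/bib in this workspace), and the decompress runs add -y (read here, as an assumption, as "verify the result"; the trace itself never expands the flag). Both invocations as they appear in this section:

    # The compress/decompress pair from this section, runnable stand-alone.
    # The '{}' config on fd 62 is the same assumption as in the earlier sketch.
    SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    BIB=$SPDK/test/accel/bib

    "$SPDK/build/examples/accel_perf" -c /dev/fd/62 -t 1 -w compress   -l "$BIB"    62<<< '{}'
    "$SPDK/build/examples/accel_perf" -c /dev/fd/62 -t 1 -w decompress -l "$BIB" -y 62<<< '{}'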
00:06:22.728 [2024-04-19 10:24:44.680768] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid197997 ] 00:06:22.728 EAL: No free 2048 kB hugepages reported on node 1 00:06:22.728 [2024-04-19 10:24:44.752042] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.728 [2024-04-19 10:24:44.829576] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.988 10:24:44 -- accel/accel.sh@20 -- # val= 00:06:22.988 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.988 10:24:44 -- accel/accel.sh@20 -- # val= 00:06:22.988 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.988 10:24:44 -- accel/accel.sh@20 -- # val= 00:06:22.988 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.988 10:24:44 -- accel/accel.sh@20 -- # val=0x1 00:06:22.988 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.988 10:24:44 -- accel/accel.sh@20 -- # val= 00:06:22.988 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.988 10:24:44 -- accel/accel.sh@20 -- # val= 00:06:22.988 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.988 10:24:44 -- accel/accel.sh@20 -- # val=decompress 00:06:22.988 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.988 10:24:44 -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.988 10:24:44 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:22.988 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.988 10:24:44 -- accel/accel.sh@20 -- # val= 00:06:22.988 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.988 10:24:44 -- accel/accel.sh@20 -- # val=software 00:06:22.988 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.988 10:24:44 -- accel/accel.sh@22 -- # accel_module=software 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.988 10:24:44 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:22.988 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.988 10:24:44 -- accel/accel.sh@20 -- # val=32 00:06:22.988 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.988 10:24:44 
-- accel/accel.sh@19 -- # read -r var val 00:06:22.988 10:24:44 -- accel/accel.sh@20 -- # val=32 00:06:22.988 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.988 10:24:44 -- accel/accel.sh@20 -- # val=1 00:06:22.988 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.988 10:24:44 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:22.988 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.988 10:24:44 -- accel/accel.sh@20 -- # val=Yes 00:06:22.988 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.988 10:24:44 -- accel/accel.sh@20 -- # val= 00:06:22.988 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:22.988 10:24:44 -- accel/accel.sh@20 -- # val= 00:06:22.988 10:24:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # IFS=: 00:06:22.988 10:24:44 -- accel/accel.sh@19 -- # read -r var val 00:06:23.926 10:24:45 -- accel/accel.sh@20 -- # val= 00:06:23.926 10:24:45 -- accel/accel.sh@21 -- # case "$var" in 00:06:23.926 10:24:45 -- accel/accel.sh@19 -- # IFS=: 00:06:23.926 10:24:45 -- accel/accel.sh@19 -- # read -r var val 00:06:23.926 10:24:46 -- accel/accel.sh@20 -- # val= 00:06:23.926 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:23.926 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:23.926 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:23.926 10:24:46 -- accel/accel.sh@20 -- # val= 00:06:23.926 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:23.926 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:23.926 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:23.926 10:24:46 -- accel/accel.sh@20 -- # val= 00:06:23.926 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:23.926 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:23.926 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:23.926 10:24:46 -- accel/accel.sh@20 -- # val= 00:06:23.926 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:23.926 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:23.926 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:23.926 10:24:46 -- accel/accel.sh@20 -- # val= 00:06:23.926 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:23.926 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:23.926 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:23.926 10:24:46 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:23.926 10:24:46 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:23.926 10:24:46 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:23.926 00:06:23.926 real 0m1.348s 00:06:23.926 user 0m1.226s 00:06:23.926 sys 0m0.136s 00:06:23.926 10:24:46 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:23.926 10:24:46 -- common/autotest_common.sh@10 -- # set +x 00:06:23.926 ************************************ 00:06:23.926 END TEST accel_decomp 00:06:23.926 ************************************ 00:06:24.186 10:24:46 -- accel/accel.sh@118 -- # run_test accel_decmop_full accel_test -t 1 -w 
decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:24.186 10:24:46 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:06:24.186 10:24:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:24.186 10:24:46 -- common/autotest_common.sh@10 -- # set +x 00:06:24.186 ************************************ 00:06:24.186 START TEST accel_decmop_full 00:06:24.186 ************************************ 00:06:24.186 10:24:46 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:24.186 10:24:46 -- accel/accel.sh@16 -- # local accel_opc 00:06:24.186 10:24:46 -- accel/accel.sh@17 -- # local accel_module 00:06:24.186 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:24.186 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:24.186 10:24:46 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:24.186 10:24:46 -- accel/accel.sh@12 -- # build_accel_config 00:06:24.186 10:24:46 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:24.186 10:24:46 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:24.186 10:24:46 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:24.186 10:24:46 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:24.186 10:24:46 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:24.186 10:24:46 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:24.186 10:24:46 -- accel/accel.sh@40 -- # local IFS=, 00:06:24.186 10:24:46 -- accel/accel.sh@41 -- # jq -r . 00:06:24.186 [2024-04-19 10:24:46.173441] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
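[editor's note] accel_decmop_full differs from accel_decomp only in the trailing "-o 0", and its trace reports '111250 bytes' where the other tests report '4096 bytes'. A plausible reading, which this log does not confirm, is that -o overrides the I/O size and 0 means "size from the whole input"; the input's actual size can be checked directly:

    # Assumption: -o 0 makes accel_perf size its buffer from the input file.
    # Check what that size would be for this workspace's input:
    stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib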
00:06:24.186 [2024-04-19 10:24:46.173528] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid198199 ] 00:06:24.186 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.186 [2024-04-19 10:24:46.244062] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:24.447 [2024-04-19 10:24:46.321743] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.447 10:24:46 -- accel/accel.sh@20 -- # val= 00:06:24.447 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:24.447 10:24:46 -- accel/accel.sh@20 -- # val= 00:06:24.447 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:24.447 10:24:46 -- accel/accel.sh@20 -- # val= 00:06:24.447 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:24.447 10:24:46 -- accel/accel.sh@20 -- # val=0x1 00:06:24.447 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:24.447 10:24:46 -- accel/accel.sh@20 -- # val= 00:06:24.447 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:24.447 10:24:46 -- accel/accel.sh@20 -- # val= 00:06:24.447 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:24.447 10:24:46 -- accel/accel.sh@20 -- # val=decompress 00:06:24.447 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.447 10:24:46 -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:24.447 10:24:46 -- accel/accel.sh@20 -- # val='111250 bytes' 00:06:24.447 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:24.447 10:24:46 -- accel/accel.sh@20 -- # val= 00:06:24.447 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:24.447 10:24:46 -- accel/accel.sh@20 -- # val=software 00:06:24.447 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.447 10:24:46 -- accel/accel.sh@22 -- # accel_module=software 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:24.447 10:24:46 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:24.447 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:24.447 10:24:46 -- accel/accel.sh@20 -- # val=32 00:06:24.447 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:24.447 
10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:24.447 10:24:46 -- accel/accel.sh@20 -- # val=32 00:06:24.447 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:24.447 10:24:46 -- accel/accel.sh@20 -- # val=1 00:06:24.447 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:24.447 10:24:46 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:24.447 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:24.447 10:24:46 -- accel/accel.sh@20 -- # val=Yes 00:06:24.447 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:24.447 10:24:46 -- accel/accel.sh@20 -- # val= 00:06:24.447 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:24.447 10:24:46 -- accel/accel.sh@20 -- # val= 00:06:24.447 10:24:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # IFS=: 00:06:24.447 10:24:46 -- accel/accel.sh@19 -- # read -r var val 00:06:25.386 10:24:47 -- accel/accel.sh@20 -- # val= 00:06:25.386 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.386 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.386 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.386 10:24:47 -- accel/accel.sh@20 -- # val= 00:06:25.386 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.386 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.646 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.646 10:24:47 -- accel/accel.sh@20 -- # val= 00:06:25.646 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.646 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.646 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.646 10:24:47 -- accel/accel.sh@20 -- # val= 00:06:25.646 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.646 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.646 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.646 10:24:47 -- accel/accel.sh@20 -- # val= 00:06:25.646 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.646 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.646 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.646 10:24:47 -- accel/accel.sh@20 -- # val= 00:06:25.646 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.646 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.646 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.646 10:24:47 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:25.646 10:24:47 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:25.646 10:24:47 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:25.646 00:06:25.646 real 0m1.350s 00:06:25.646 user 0m1.234s 00:06:25.646 sys 0m0.130s 00:06:25.646 10:24:47 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:25.646 10:24:47 -- common/autotest_common.sh@10 -- # set +x 00:06:25.646 ************************************ 00:06:25.646 END TEST accel_decmop_full 00:06:25.646 ************************************ 00:06:25.646 10:24:47 -- accel/accel.sh@119 -- # run_test accel_decomp_mcore 
accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:25.646 10:24:47 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:06:25.646 10:24:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:25.646 10:24:47 -- common/autotest_common.sh@10 -- # set +x 00:06:25.646 ************************************ 00:06:25.646 START TEST accel_decomp_mcore 00:06:25.646 ************************************ 00:06:25.646 10:24:47 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:25.646 10:24:47 -- accel/accel.sh@16 -- # local accel_opc 00:06:25.646 10:24:47 -- accel/accel.sh@17 -- # local accel_module 00:06:25.646 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.646 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.646 10:24:47 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:25.646 10:24:47 -- accel/accel.sh@12 -- # build_accel_config 00:06:25.646 10:24:47 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:25.646 10:24:47 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:25.646 10:24:47 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:25.646 10:24:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:25.646 10:24:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:25.646 10:24:47 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:25.646 10:24:47 -- accel/accel.sh@40 -- # local IFS=, 00:06:25.646 10:24:47 -- accel/accel.sh@41 -- # jq -r . 00:06:25.646 [2024-04-19 10:24:47.672064] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
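[editor's note] The mcore variant adds "-m 0xf", and the EAL banner above duly reports four available cores with reactors starting on cores 0 through 3. Note also the timing summary further below: user time (~4.5s) exceeds real time (~1.35s), consistent with four cores decompressing in parallel. The mask-to-core-count arithmetic, as a sketch:

    # How many reactors a core mask such as -m 0xf implies: count the set bits.
    mask=0xf
    count=0
    for (( m = mask; m > 0; m >>= 1 )); do
        (( count += m & 1 ))
    done
    echo "cores in $mask: $count"   # 0xf -> 4 (cores 0..3)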
00:06:25.646 [2024-04-19 10:24:47.672127] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid198394 ] 00:06:25.646 EAL: No free 2048 kB hugepages reported on node 1 00:06:25.646 [2024-04-19 10:24:47.739313] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:25.905 [2024-04-19 10:24:47.821040] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:25.905 [2024-04-19 10:24:47.821127] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:25.905 [2024-04-19 10:24:47.821204] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:25.905 [2024-04-19 10:24:47.821206] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.905 10:24:47 -- accel/accel.sh@20 -- # val= 00:06:25.906 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.906 10:24:47 -- accel/accel.sh@20 -- # val= 00:06:25.906 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.906 10:24:47 -- accel/accel.sh@20 -- # val= 00:06:25.906 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.906 10:24:47 -- accel/accel.sh@20 -- # val=0xf 00:06:25.906 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.906 10:24:47 -- accel/accel.sh@20 -- # val= 00:06:25.906 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.906 10:24:47 -- accel/accel.sh@20 -- # val= 00:06:25.906 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.906 10:24:47 -- accel/accel.sh@20 -- # val=decompress 00:06:25.906 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.906 10:24:47 -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.906 10:24:47 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:25.906 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.906 10:24:47 -- accel/accel.sh@20 -- # val= 00:06:25.906 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.906 10:24:47 -- accel/accel.sh@20 -- # val=software 00:06:25.906 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.906 10:24:47 -- accel/accel.sh@22 -- # accel_module=software 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.906 10:24:47 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:25.906 10:24:47 -- accel/accel.sh@21 -- # case 
"$var" in 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.906 10:24:47 -- accel/accel.sh@20 -- # val=32 00:06:25.906 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.906 10:24:47 -- accel/accel.sh@20 -- # val=32 00:06:25.906 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.906 10:24:47 -- accel/accel.sh@20 -- # val=1 00:06:25.906 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.906 10:24:47 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:25.906 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.906 10:24:47 -- accel/accel.sh@20 -- # val=Yes 00:06:25.906 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.906 10:24:47 -- accel/accel.sh@20 -- # val= 00:06:25.906 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:25.906 10:24:47 -- accel/accel.sh@20 -- # val= 00:06:25.906 10:24:47 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # IFS=: 00:06:25.906 10:24:47 -- accel/accel.sh@19 -- # read -r var val 00:06:27.285 10:24:49 -- accel/accel.sh@20 -- # val= 00:06:27.285 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.285 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.285 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.285 10:24:49 -- accel/accel.sh@20 -- # val= 00:06:27.285 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.285 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.285 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.285 10:24:49 -- accel/accel.sh@20 -- # val= 00:06:27.285 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.285 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.285 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.285 10:24:49 -- accel/accel.sh@20 -- # val= 00:06:27.285 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.285 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.285 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.285 10:24:49 -- accel/accel.sh@20 -- # val= 00:06:27.285 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.285 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.285 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.285 10:24:49 -- accel/accel.sh@20 -- # val= 00:06:27.285 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.285 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.285 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.285 10:24:49 -- accel/accel.sh@20 -- # val= 00:06:27.285 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.285 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.285 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.285 10:24:49 -- accel/accel.sh@20 -- # val= 00:06:27.285 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.285 
10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.285 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.285 10:24:49 -- accel/accel.sh@20 -- # val= 00:06:27.285 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.285 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.285 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.285 10:24:49 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:27.285 10:24:49 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:27.285 10:24:49 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:27.285 00:06:27.286 real 0m1.354s 00:06:27.286 user 0m4.566s 00:06:27.286 sys 0m0.138s 00:06:27.286 10:24:49 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:27.286 10:24:49 -- common/autotest_common.sh@10 -- # set +x 00:06:27.286 ************************************ 00:06:27.286 END TEST accel_decomp_mcore 00:06:27.286 ************************************ 00:06:27.286 10:24:49 -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:27.286 10:24:49 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:06:27.286 10:24:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:27.286 10:24:49 -- common/autotest_common.sh@10 -- # set +x 00:06:27.286 ************************************ 00:06:27.286 START TEST accel_decomp_full_mcore 00:06:27.286 ************************************ 00:06:27.286 10:24:49 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:27.286 10:24:49 -- accel/accel.sh@16 -- # local accel_opc 00:06:27.286 10:24:49 -- accel/accel.sh@17 -- # local accel_module 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.286 10:24:49 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:27.286 10:24:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:27.286 10:24:49 -- accel/accel.sh@12 -- # build_accel_config 00:06:27.286 10:24:49 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:27.286 10:24:49 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:27.286 10:24:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:27.286 10:24:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:27.286 10:24:49 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:27.286 10:24:49 -- accel/accel.sh@40 -- # local IFS=, 00:06:27.286 10:24:49 -- accel/accel.sh@41 -- # jq -r . 00:06:27.286 [2024-04-19 10:24:49.181312] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
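The mcore case passes: real 0m1.354s against user 0m4.566s, roughly four CPU-seconds of work per wall-clock second, which is what four busy reactors should produce. accel_decomp_full_mcore, initializing above, is the same run with -o 0 appended; judging from the traced buffer values (val='111250 bytes' here versus val='4096 bytes' in the block-sized run), -o 0 makes the wrapper hand accel_perf the whole bib payload as a single transfer instead of 4096-byte chunks. The invocation as recorded in the trace:

    run_test accel_decomp_full_mcore accel_test -t 1 -w decompress \
        -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib \
        -y -o 0 -m 0xf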
00:06:27.286 [2024-04-19 10:24:49.181375] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid198596 ] 00:06:27.286 EAL: No free 2048 kB hugepages reported on node 1 00:06:27.286 [2024-04-19 10:24:49.250310] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:27.286 [2024-04-19 10:24:49.329911] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:27.286 [2024-04-19 10:24:49.329999] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:27.286 [2024-04-19 10:24:49.330074] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:27.286 [2024-04-19 10:24:49.330076] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:27.286 10:24:49 -- accel/accel.sh@20 -- # val= 00:06:27.286 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.286 10:24:49 -- accel/accel.sh@20 -- # val= 00:06:27.286 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.286 10:24:49 -- accel/accel.sh@20 -- # val= 00:06:27.286 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.286 10:24:49 -- accel/accel.sh@20 -- # val=0xf 00:06:27.286 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.286 10:24:49 -- accel/accel.sh@20 -- # val= 00:06:27.286 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.286 10:24:49 -- accel/accel.sh@20 -- # val= 00:06:27.286 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.286 10:24:49 -- accel/accel.sh@20 -- # val=decompress 00:06:27.286 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.286 10:24:49 -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.286 10:24:49 -- accel/accel.sh@20 -- # val='111250 bytes' 00:06:27.286 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.286 10:24:49 -- accel/accel.sh@20 -- # val= 00:06:27.286 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.286 10:24:49 -- accel/accel.sh@20 -- # val=software 00:06:27.286 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.286 10:24:49 -- accel/accel.sh@22 -- # accel_module=software 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.286 10:24:49 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:27.286 10:24:49 -- accel/accel.sh@21 -- # case 
"$var" in 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.286 10:24:49 -- accel/accel.sh@20 -- # val=32 00:06:27.286 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.286 10:24:49 -- accel/accel.sh@20 -- # val=32 00:06:27.286 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.286 10:24:49 -- accel/accel.sh@20 -- # val=1 00:06:27.286 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.286 10:24:49 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:27.286 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.286 10:24:49 -- accel/accel.sh@20 -- # val=Yes 00:06:27.286 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.286 10:24:49 -- accel/accel.sh@20 -- # val= 00:06:27.286 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:27.286 10:24:49 -- accel/accel.sh@20 -- # val= 00:06:27.286 10:24:49 -- accel/accel.sh@21 -- # case "$var" in 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # IFS=: 00:06:27.286 10:24:49 -- accel/accel.sh@19 -- # read -r var val 00:06:28.665 10:24:50 -- accel/accel.sh@20 -- # val= 00:06:28.665 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.665 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.665 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.665 10:24:50 -- accel/accel.sh@20 -- # val= 00:06:28.665 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.665 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.665 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.665 10:24:50 -- accel/accel.sh@20 -- # val= 00:06:28.665 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.665 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.665 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.665 10:24:50 -- accel/accel.sh@20 -- # val= 00:06:28.665 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.665 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.665 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.665 10:24:50 -- accel/accel.sh@20 -- # val= 00:06:28.665 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.665 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.665 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.665 10:24:50 -- accel/accel.sh@20 -- # val= 00:06:28.665 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.665 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.665 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.665 10:24:50 -- accel/accel.sh@20 -- # val= 00:06:28.665 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.665 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.665 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.665 10:24:50 -- accel/accel.sh@20 -- # val= 00:06:28.665 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.665 
10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.665 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.665 10:24:50 -- accel/accel.sh@20 -- # val= 00:06:28.665 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.665 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.665 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.665 10:24:50 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:28.665 10:24:50 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:28.665 10:24:50 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:28.665 00:06:28.665 real 0m1.364s 00:06:28.665 user 0m4.606s 00:06:28.665 sys 0m0.134s 00:06:28.665 10:24:50 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:28.665 10:24:50 -- common/autotest_common.sh@10 -- # set +x 00:06:28.665 ************************************ 00:06:28.665 END TEST accel_decomp_full_mcore 00:06:28.665 ************************************ 00:06:28.665 10:24:50 -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:28.665 10:24:50 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:06:28.665 10:24:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:28.665 10:24:50 -- common/autotest_common.sh@10 -- # set +x 00:06:28.665 ************************************ 00:06:28.665 START TEST accel_decomp_mthread 00:06:28.665 ************************************ 00:06:28.665 10:24:50 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:28.665 10:24:50 -- accel/accel.sh@16 -- # local accel_opc 00:06:28.665 10:24:50 -- accel/accel.sh@17 -- # local accel_module 00:06:28.665 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.665 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.666 10:24:50 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:28.666 10:24:50 -- accel/accel.sh@12 -- # build_accel_config 00:06:28.666 10:24:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:28.666 10:24:50 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:28.666 10:24:50 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:28.666 10:24:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:28.666 10:24:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:28.666 10:24:50 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:28.666 10:24:50 -- accel/accel.sh@40 -- # local IFS=, 00:06:28.666 10:24:50 -- accel/accel.sh@41 -- # jq -r . 00:06:28.666 [2024-04-19 10:24:50.720450] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
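full_mcore also passes (real 0m1.364s, user 0m4.606s). accel_decomp_mthread, starting above, drops the core mask back to the default 0x1 and instead passes -T 2; the traced val=2, where the single-threaded runs showed val=1, indicates the wrapper forwards this to accel_perf as a request for two worker threads on the one active core. As recorded:

    run_test accel_decomp_mthread accel_test -t 1 -w decompress \
        -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib \
        -y -T 2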
00:06:28.666 [2024-04-19 10:24:50.720537] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid198795 ] 00:06:28.666 EAL: No free 2048 kB hugepages reported on node 1 00:06:28.925 [2024-04-19 10:24:50.794817] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:28.925 [2024-04-19 10:24:50.873219] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.925 10:24:50 -- accel/accel.sh@20 -- # val= 00:06:28.925 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.925 10:24:50 -- accel/accel.sh@20 -- # val= 00:06:28.925 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.925 10:24:50 -- accel/accel.sh@20 -- # val= 00:06:28.925 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.925 10:24:50 -- accel/accel.sh@20 -- # val=0x1 00:06:28.925 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.925 10:24:50 -- accel/accel.sh@20 -- # val= 00:06:28.925 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.925 10:24:50 -- accel/accel.sh@20 -- # val= 00:06:28.925 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.925 10:24:50 -- accel/accel.sh@20 -- # val=decompress 00:06:28.925 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.925 10:24:50 -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.925 10:24:50 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:28.925 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.925 10:24:50 -- accel/accel.sh@20 -- # val= 00:06:28.925 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.925 10:24:50 -- accel/accel.sh@20 -- # val=software 00:06:28.925 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.925 10:24:50 -- accel/accel.sh@22 -- # accel_module=software 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.925 10:24:50 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:28.925 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.925 10:24:50 -- accel/accel.sh@20 -- # val=32 00:06:28.925 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.925 10:24:50 
-- accel/accel.sh@19 -- # read -r var val 00:06:28.925 10:24:50 -- accel/accel.sh@20 -- # val=32 00:06:28.925 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.925 10:24:50 -- accel/accel.sh@20 -- # val=2 00:06:28.925 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.925 10:24:50 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:28.925 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.925 10:24:50 -- accel/accel.sh@20 -- # val=Yes 00:06:28.925 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.925 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.925 10:24:50 -- accel/accel.sh@20 -- # val= 00:06:28.926 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.926 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.926 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:28.926 10:24:50 -- accel/accel.sh@20 -- # val= 00:06:28.926 10:24:50 -- accel/accel.sh@21 -- # case "$var" in 00:06:28.926 10:24:50 -- accel/accel.sh@19 -- # IFS=: 00:06:28.926 10:24:50 -- accel/accel.sh@19 -- # read -r var val 00:06:30.305 10:24:52 -- accel/accel.sh@20 -- # val= 00:06:30.305 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.305 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.305 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.305 10:24:52 -- accel/accel.sh@20 -- # val= 00:06:30.305 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.305 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.305 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.305 10:24:52 -- accel/accel.sh@20 -- # val= 00:06:30.305 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.305 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.305 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.305 10:24:52 -- accel/accel.sh@20 -- # val= 00:06:30.305 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.305 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.305 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.305 10:24:52 -- accel/accel.sh@20 -- # val= 00:06:30.305 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.305 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.305 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.305 10:24:52 -- accel/accel.sh@20 -- # val= 00:06:30.306 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.306 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.306 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.306 10:24:52 -- accel/accel.sh@20 -- # val= 00:06:30.306 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.306 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.306 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.306 10:24:52 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:30.306 10:24:52 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:30.306 10:24:52 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:30.306 00:06:30.306 real 0m1.357s 00:06:30.306 user 0m1.240s 00:06:30.306 sys 0m0.131s 00:06:30.306 10:24:52 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:30.306 10:24:52 -- common/autotest_common.sh@10 -- # set +x 
00:06:30.306 ************************************ 00:06:30.306 END TEST accel_decomp_mthread 00:06:30.306 ************************************ 00:06:30.306 10:24:52 -- accel/accel.sh@122 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:30.306 10:24:52 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:06:30.306 10:24:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:30.306 10:24:52 -- common/autotest_common.sh@10 -- # set +x 00:06:30.306 ************************************ 00:06:30.306 START TEST accel_deomp_full_mthread 00:06:30.306 ************************************ 00:06:30.306 10:24:52 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:30.306 10:24:52 -- accel/accel.sh@16 -- # local accel_opc 00:06:30.306 10:24:52 -- accel/accel.sh@17 -- # local accel_module 00:06:30.306 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.306 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.306 10:24:52 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:30.306 10:24:52 -- accel/accel.sh@12 -- # build_accel_config 00:06:30.306 10:24:52 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:30.306 10:24:52 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:30.306 10:24:52 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:30.306 10:24:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:30.306 10:24:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:30.306 10:24:52 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:30.306 10:24:52 -- accel/accel.sh@40 -- # local IFS=, 00:06:30.306 10:24:52 -- accel/accel.sh@41 -- # jq -r . 00:06:30.306 [2024-04-19 10:24:52.237671] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
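accel_deomp_full_mthread (the 'deomp' spelling comes from the test name in accel.sh itself, which the log reproduces faithfully) combines the two preceding variations: the full 111250-byte transfer of -o 0 plus the two worker threads of -T 2, still on a single core. As recorded:

    run_test accel_deomp_full_mthread accel_test -t 1 -w decompress \
        -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib \
        -y -o 0 -T 2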
00:06:30.306 [2024-04-19 10:24:52.237757] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid199035 ] 00:06:30.306 EAL: No free 2048 kB hugepages reported on node 1 00:06:30.306 [2024-04-19 10:24:52.310218] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:30.306 [2024-04-19 10:24:52.387247] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:30.565 10:24:52 -- accel/accel.sh@20 -- # val= 00:06:30.565 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.565 10:24:52 -- accel/accel.sh@20 -- # val= 00:06:30.565 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.565 10:24:52 -- accel/accel.sh@20 -- # val= 00:06:30.565 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.565 10:24:52 -- accel/accel.sh@20 -- # val=0x1 00:06:30.565 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.565 10:24:52 -- accel/accel.sh@20 -- # val= 00:06:30.565 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.565 10:24:52 -- accel/accel.sh@20 -- # val= 00:06:30.565 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.565 10:24:52 -- accel/accel.sh@20 -- # val=decompress 00:06:30.565 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.565 10:24:52 -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.565 10:24:52 -- accel/accel.sh@20 -- # val='111250 bytes' 00:06:30.565 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.565 10:24:52 -- accel/accel.sh@20 -- # val= 00:06:30.565 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.565 10:24:52 -- accel/accel.sh@20 -- # val=software 00:06:30.565 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.565 10:24:52 -- accel/accel.sh@22 -- # accel_module=software 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.565 10:24:52 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:30.565 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.565 10:24:52 -- accel/accel.sh@20 -- # val=32 00:06:30.565 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.565 
10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.565 10:24:52 -- accel/accel.sh@20 -- # val=32 00:06:30.565 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.565 10:24:52 -- accel/accel.sh@20 -- # val=2 00:06:30.565 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.565 10:24:52 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:30.565 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.565 10:24:52 -- accel/accel.sh@20 -- # val=Yes 00:06:30.565 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.565 10:24:52 -- accel/accel.sh@20 -- # val= 00:06:30.565 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:30.565 10:24:52 -- accel/accel.sh@20 -- # val= 00:06:30.565 10:24:52 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # IFS=: 00:06:30.565 10:24:52 -- accel/accel.sh@19 -- # read -r var val 00:06:31.503 10:24:53 -- accel/accel.sh@20 -- # val= 00:06:31.503 10:24:53 -- accel/accel.sh@21 -- # case "$var" in 00:06:31.503 10:24:53 -- accel/accel.sh@19 -- # IFS=: 00:06:31.503 10:24:53 -- accel/accel.sh@19 -- # read -r var val 00:06:31.503 10:24:53 -- accel/accel.sh@20 -- # val= 00:06:31.503 10:24:53 -- accel/accel.sh@21 -- # case "$var" in 00:06:31.503 10:24:53 -- accel/accel.sh@19 -- # IFS=: 00:06:31.503 10:24:53 -- accel/accel.sh@19 -- # read -r var val 00:06:31.503 10:24:53 -- accel/accel.sh@20 -- # val= 00:06:31.503 10:24:53 -- accel/accel.sh@21 -- # case "$var" in 00:06:31.503 10:24:53 -- accel/accel.sh@19 -- # IFS=: 00:06:31.503 10:24:53 -- accel/accel.sh@19 -- # read -r var val 00:06:31.503 10:24:53 -- accel/accel.sh@20 -- # val= 00:06:31.503 10:24:53 -- accel/accel.sh@21 -- # case "$var" in 00:06:31.503 10:24:53 -- accel/accel.sh@19 -- # IFS=: 00:06:31.503 10:24:53 -- accel/accel.sh@19 -- # read -r var val 00:06:31.503 10:24:53 -- accel/accel.sh@20 -- # val= 00:06:31.503 10:24:53 -- accel/accel.sh@21 -- # case "$var" in 00:06:31.503 10:24:53 -- accel/accel.sh@19 -- # IFS=: 00:06:31.503 10:24:53 -- accel/accel.sh@19 -- # read -r var val 00:06:31.503 10:24:53 -- accel/accel.sh@20 -- # val= 00:06:31.503 10:24:53 -- accel/accel.sh@21 -- # case "$var" in 00:06:31.503 10:24:53 -- accel/accel.sh@19 -- # IFS=: 00:06:31.503 10:24:53 -- accel/accel.sh@19 -- # read -r var val 00:06:31.503 10:24:53 -- accel/accel.sh@20 -- # val= 00:06:31.503 10:24:53 -- accel/accel.sh@21 -- # case "$var" in 00:06:31.503 10:24:53 -- accel/accel.sh@19 -- # IFS=: 00:06:31.503 10:24:53 -- accel/accel.sh@19 -- # read -r var val 00:06:31.503 10:24:53 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:31.503 10:24:53 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:31.503 10:24:53 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:31.503 00:06:31.503 real 0m1.368s 00:06:31.503 user 0m1.238s 00:06:31.503 sys 0m0.143s 00:06:31.503 10:24:53 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:31.503 10:24:53 -- common/autotest_common.sh@10 -- # 
set +x 00:06:31.503 ************************************ 00:06:31.503 END TEST accel_deomp_full_mthread 00:06:31.503 ************************************ 00:06:31.763 10:24:53 -- accel/accel.sh@124 -- # [[ n == y ]] 00:06:31.763 10:24:53 -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:06:31.763 10:24:53 -- accel/accel.sh@137 -- # build_accel_config 00:06:31.763 10:24:53 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:06:31.763 10:24:53 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:31.763 10:24:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:31.763 10:24:53 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:31.763 10:24:53 -- common/autotest_common.sh@10 -- # set +x 00:06:31.763 10:24:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.763 10:24:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.763 10:24:53 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:31.763 10:24:53 -- accel/accel.sh@40 -- # local IFS=, 00:06:31.763 10:24:53 -- accel/accel.sh@41 -- # jq -r . 00:06:31.763 ************************************ 00:06:31.763 START TEST accel_dif_functional_tests 00:06:31.763 ************************************ 00:06:31.763 10:24:53 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:06:31.763 [2024-04-19 10:24:53.760355] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:06:31.763 [2024-04-19 10:24:53.760435] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid199330 ] 00:06:31.763 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.763 [2024-04-19 10:24:53.831148] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:32.022 [2024-04-19 10:24:53.909407] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:32.022 [2024-04-19 10:24:53.909513] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:32.022 [2024-04-19 10:24:53.909505] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:32.022 00:06:32.022 00:06:32.022 CUnit - A unit testing framework for C - Version 2.1-3 00:06:32.022 http://cunit.sourceforge.net/ 00:06:32.022 00:06:32.022 00:06:32.022 Suite: accel_dif 00:06:32.022 Test: verify: DIF generated, GUARD check ...passed 00:06:32.022 Test: verify: DIF generated, APPTAG check ...passed 00:06:32.022 Test: verify: DIF generated, REFTAG check ...passed 00:06:32.022 Test: verify: DIF not generated, GUARD check ...[2024-04-19 10:24:53.979606] dif.c: 828:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:06:32.022 [2024-04-19 10:24:53.979656] dif.c: 828:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:06:32.022 passed 00:06:32.022 Test: verify: DIF not generated, APPTAG check ...[2024-04-19 10:24:53.979690] dif.c: 843:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:06:32.022 [2024-04-19 10:24:53.979709] dif.c: 843:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:06:32.022 passed 00:06:32.022 Test: verify: DIF not generated, REFTAG check ...[2024-04-19 10:24:53.979733] dif.c: 778:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:06:32.022 [2024-04-19 
10:24:53.979754] dif.c: 778:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:06:32.022 passed 00:06:32.022 Test: verify: APPTAG correct, APPTAG check ...passed 00:06:32.022 Test: verify: APPTAG incorrect, APPTAG check ...[2024-04-19 10:24:53.979798] dif.c: 843:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:06:32.022 passed 00:06:32.022 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:06:32.022 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:06:32.022 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:06:32.022 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-04-19 10:24:53.979906] dif.c: 778:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:06:32.022 passed 00:06:32.022 Test: generate copy: DIF generated, GUARD check ...passed 00:06:32.022 Test: generate copy: DIF generated, APTTAG check ...passed 00:06:32.022 Test: generate copy: DIF generated, REFTAG check ...passed 00:06:32.022 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:06:32.022 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:06:32.022 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:06:32.022 Test: generate copy: iovecs-len validate ...[2024-04-19 10:24:53.980078] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 00:06:32.022 passed 00:06:32.022 Test: generate copy: buffer alignment validate ...passed 00:06:32.022 00:06:32.022 Run Summary: Type Total Ran Passed Failed Inactive 00:06:32.022 suites 1 1 n/a 0 0 00:06:32.022 tests 20 20 20 0 0 00:06:32.022 asserts 204 204 204 0 n/a 00:06:32.022 00:06:32.022 Elapsed time = 0.002 seconds 00:06:32.281 00:06:32.281 real 0m0.407s 00:06:32.281 user 0m0.573s 00:06:32.281 sys 0m0.153s 00:06:32.281 10:24:54 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:32.281 10:24:54 -- common/autotest_common.sh@10 -- # set +x 00:06:32.281 ************************************ 00:06:32.281 END TEST accel_dif_functional_tests 00:06:32.281 ************************************ 00:06:32.281 00:06:32.281 real 0m33.610s 00:06:32.281 user 0m35.343s 00:06:32.281 sys 0m5.946s 00:06:32.281 10:24:54 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:32.281 10:24:54 -- common/autotest_common.sh@10 -- # set +x 00:06:32.281 ************************************ 00:06:32.281 END TEST accel 00:06:32.281 ************************************ 00:06:32.281 10:24:54 -- spdk/autotest.sh@180 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:06:32.281 10:24:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:32.281 10:24:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:32.281 10:24:54 -- common/autotest_common.sh@10 -- # set +x 00:06:32.282 ************************************ 00:06:32.282 START TEST accel_rpc 00:06:32.282 ************************************ 00:06:32.282 10:24:54 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:06:32.542 * Looking for test storage... 
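That closes out accel.sh: the DIF functional suite ran 20 CUnit tests with 204 asserts and no failures, and the whole accel group finishes at real 0m33.610s. accel_rpc.sh, starting here, is an RPC-surface test rather than a data-path one: it launches spdk_tgt with --wait-for-rpc (pid 199422 in this run), then drives the opcode-assignment RPCs over /var/tmp/spdk.sock before the framework initializes. The sequence the trace performs, expressed as plain rpc.py calls (the same commands, minus the rpc_cmd wrapper):

    scripts/rpc.py accel_assign_opc -o copy -m incorrect   # bogus module, accepted pre-init
    scripts/rpc.py accel_assign_opc -o copy -m software    # real assignment, overrides it
    scripts/rpc.py framework_start_init
    scripts/rpc.py accel_get_opc_assignments | jq -r .copy # test greps for: software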
00:06:32.542 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:32.542 10:24:54 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:32.542 10:24:54 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=199422 00:06:32.542 10:24:54 -- accel/accel_rpc.sh@15 -- # waitforlisten 199422 00:06:32.542 10:24:54 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:06:32.542 10:24:54 -- common/autotest_common.sh@817 -- # '[' -z 199422 ']' 00:06:32.542 10:24:54 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:32.542 10:24:54 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:32.542 10:24:54 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:32.542 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:32.542 10:24:54 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:32.542 10:24:54 -- common/autotest_common.sh@10 -- # set +x 00:06:32.542 [2024-04-19 10:24:54.482157] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:06:32.542 [2024-04-19 10:24:54.482216] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid199422 ] 00:06:32.542 EAL: No free 2048 kB hugepages reported on node 1 00:06:32.542 [2024-04-19 10:24:54.550353] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.542 [2024-04-19 10:24:54.641204] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.483 10:24:55 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:33.483 10:24:55 -- common/autotest_common.sh@850 -- # return 0 00:06:33.483 10:24:55 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:06:33.483 10:24:55 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:06:33.483 10:24:55 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:06:33.483 10:24:55 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:06:33.483 10:24:55 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:06:33.483 10:24:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:33.483 10:24:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:33.483 10:24:55 -- common/autotest_common.sh@10 -- # set +x 00:06:33.483 ************************************ 00:06:33.483 START TEST accel_assign_opcode 00:06:33.483 ************************************ 00:06:33.483 10:24:55 -- common/autotest_common.sh@1111 -- # accel_assign_opcode_test_suite 00:06:33.483 10:24:55 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:06:33.483 10:24:55 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:33.483 10:24:55 -- common/autotest_common.sh@10 -- # set +x 00:06:33.483 [2024-04-19 10:24:55.375361] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:06:33.483 10:24:55 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:33.483 10:24:55 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:06:33.483 10:24:55 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:33.483 10:24:55 -- common/autotest_common.sh@10 -- # set +x 00:06:33.483 [2024-04-19 10:24:55.383355] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: 
Operation copy will be assigned to module software 00:06:33.483 10:24:55 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:33.483 10:24:55 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:06:33.483 10:24:55 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:33.483 10:24:55 -- common/autotest_common.sh@10 -- # set +x 00:06:33.483 10:24:55 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:33.483 10:24:55 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:06:33.483 10:24:55 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:33.483 10:24:55 -- common/autotest_common.sh@10 -- # set +x 00:06:33.483 10:24:55 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:06:33.483 10:24:55 -- accel/accel_rpc.sh@42 -- # grep software 00:06:33.483 10:24:55 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:33.777 software 00:06:33.777 00:06:33.777 real 0m0.232s 00:06:33.777 user 0m0.043s 00:06:33.777 sys 0m0.010s 00:06:33.777 10:24:55 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:33.777 10:24:55 -- common/autotest_common.sh@10 -- # set +x 00:06:33.777 ************************************ 00:06:33.777 END TEST accel_assign_opcode 00:06:33.777 ************************************ 00:06:33.777 10:24:55 -- accel/accel_rpc.sh@55 -- # killprocess 199422 00:06:33.777 10:24:55 -- common/autotest_common.sh@936 -- # '[' -z 199422 ']' 00:06:33.777 10:24:55 -- common/autotest_common.sh@940 -- # kill -0 199422 00:06:33.777 10:24:55 -- common/autotest_common.sh@941 -- # uname 00:06:33.777 10:24:55 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:33.777 10:24:55 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 199422 00:06:33.777 10:24:55 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:33.777 10:24:55 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:33.777 10:24:55 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 199422' 00:06:33.777 killing process with pid 199422 00:06:33.777 10:24:55 -- common/autotest_common.sh@955 -- # kill 199422 00:06:33.777 10:24:55 -- common/autotest_common.sh@960 -- # wait 199422 00:06:34.037 00:06:34.037 real 0m1.631s 00:06:34.037 user 0m1.676s 00:06:34.037 sys 0m0.481s 00:06:34.037 10:24:55 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:34.037 10:24:55 -- common/autotest_common.sh@10 -- # set +x 00:06:34.037 ************************************ 00:06:34.037 END TEST accel_rpc 00:06:34.037 ************************************ 00:06:34.037 10:24:56 -- spdk/autotest.sh@181 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:06:34.037 10:24:56 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:34.037 10:24:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:34.037 10:24:56 -- common/autotest_common.sh@10 -- # set +x 00:06:34.299 ************************************ 00:06:34.299 START TEST app_cmdline 00:06:34.299 ************************************ 00:06:34.299 10:24:56 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:06:34.299 * Looking for test storage... 
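accel_rpc passes and kills its target; app_cmdline.sh repeats the spdk_tgt exercise from the opposite direction. As the trace below shows, this target is started with --rpcs-allowed spdk_get_version,rpc_get_methods, so exactly two methods are callable and everything else must fail cleanly:

    scripts/rpc.py spdk_get_version        # returns the version JSON below
    scripts/rpc.py rpc_get_methods         # lists exactly the two allowed methods
    scripts/rpc.py env_dpdk_get_mem_stats  # rejected: -32601 'Method not found'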
00:06:34.299 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:06:34.299 10:24:56 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:06:34.299 10:24:56 -- app/cmdline.sh@17 -- # spdk_tgt_pid=199750 00:06:34.299 10:24:56 -- app/cmdline.sh@18 -- # waitforlisten 199750 00:06:34.299 10:24:56 -- common/autotest_common.sh@817 -- # '[' -z 199750 ']' 00:06:34.299 10:24:56 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:34.299 10:24:56 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:34.299 10:24:56 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:34.299 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:34.299 10:24:56 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:34.299 10:24:56 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:06:34.299 10:24:56 -- common/autotest_common.sh@10 -- # set +x 00:06:34.299 [2024-04-19 10:24:56.270714] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:06:34.299 [2024-04-19 10:24:56.270806] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid199750 ] 00:06:34.299 EAL: No free 2048 kB hugepages reported on node 1 00:06:34.299 [2024-04-19 10:24:56.342836] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.558 [2024-04-19 10:24:56.430822] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.127 10:24:57 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:35.127 10:24:57 -- common/autotest_common.sh@850 -- # return 0 00:06:35.127 10:24:57 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:06:35.127 { 00:06:35.127 "version": "SPDK v24.05-pre git sha1 3381d6e5b", 00:06:35.127 "fields": { 00:06:35.127 "major": 24, 00:06:35.127 "minor": 5, 00:06:35.127 "patch": 0, 00:06:35.127 "suffix": "-pre", 00:06:35.127 "commit": "3381d6e5b" 00:06:35.127 } 00:06:35.127 } 00:06:35.127 10:24:57 -- app/cmdline.sh@22 -- # expected_methods=() 00:06:35.127 10:24:57 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:06:35.127 10:24:57 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:06:35.127 10:24:57 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:06:35.388 10:24:57 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:06:35.388 10:24:57 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:06:35.388 10:24:57 -- app/cmdline.sh@26 -- # sort 00:06:35.388 10:24:57 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:35.388 10:24:57 -- common/autotest_common.sh@10 -- # set +x 00:06:35.388 10:24:57 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:35.388 10:24:57 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:06:35.388 10:24:57 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:06:35.388 10:24:57 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:35.388 10:24:57 -- common/autotest_common.sh@638 -- # local es=0 00:06:35.388 10:24:57 -- 
common/autotest_common.sh@640 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:35.388 10:24:57 -- common/autotest_common.sh@626 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:35.388 10:24:57 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:35.388 10:24:57 -- common/autotest_common.sh@630 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:35.388 10:24:57 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:35.388 10:24:57 -- common/autotest_common.sh@632 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:35.388 10:24:57 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:35.388 10:24:57 -- common/autotest_common.sh@632 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:35.388 10:24:57 -- common/autotest_common.sh@632 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:06:35.388 10:24:57 -- common/autotest_common.sh@641 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:35.388 request: 00:06:35.388 { 00:06:35.388 "method": "env_dpdk_get_mem_stats", 00:06:35.388 "req_id": 1 00:06:35.388 } 00:06:35.388 Got JSON-RPC error response 00:06:35.388 response: 00:06:35.388 { 00:06:35.388 "code": -32601, 00:06:35.388 "message": "Method not found" 00:06:35.388 } 00:06:35.388 10:24:57 -- common/autotest_common.sh@641 -- # es=1 00:06:35.388 10:24:57 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:35.388 10:24:57 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:06:35.388 10:24:57 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:35.388 10:24:57 -- app/cmdline.sh@1 -- # killprocess 199750 00:06:35.388 10:24:57 -- common/autotest_common.sh@936 -- # '[' -z 199750 ']' 00:06:35.388 10:24:57 -- common/autotest_common.sh@940 -- # kill -0 199750 00:06:35.388 10:24:57 -- common/autotest_common.sh@941 -- # uname 00:06:35.388 10:24:57 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:35.388 10:24:57 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 199750 00:06:35.388 10:24:57 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:35.388 10:24:57 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:35.388 10:24:57 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 199750' 00:06:35.388 killing process with pid 199750 00:06:35.388 10:24:57 -- common/autotest_common.sh@955 -- # kill 199750 00:06:35.388 10:24:57 -- common/autotest_common.sh@960 -- # wait 199750 00:06:35.960 00:06:35.960 real 0m1.622s 00:06:35.960 user 0m1.867s 00:06:35.960 sys 0m0.464s 00:06:35.960 10:24:57 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:35.960 10:24:57 -- common/autotest_common.sh@10 -- # set +x 00:06:35.960 ************************************ 00:06:35.960 END TEST app_cmdline 00:06:35.960 ************************************ 00:06:35.960 10:24:57 -- spdk/autotest.sh@182 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:06:35.960 10:24:57 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:35.960 10:24:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:35.960 10:24:57 -- common/autotest_common.sh@10 -- # set +x 00:06:35.960 ************************************ 00:06:35.960 START TEST version 00:06:35.960 
************************************ 00:06:35.960 10:24:57 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:06:35.960 * Looking for test storage... 00:06:35.960 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:06:35.960 10:24:58 -- app/version.sh@17 -- # get_header_version major 00:06:35.960 10:24:58 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:35.960 10:24:58 -- app/version.sh@14 -- # cut -f2 00:06:35.960 10:24:58 -- app/version.sh@14 -- # tr -d '"' 00:06:35.960 10:24:58 -- app/version.sh@17 -- # major=24 00:06:35.960 10:24:58 -- app/version.sh@18 -- # get_header_version minor 00:06:35.960 10:24:58 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:35.960 10:24:58 -- app/version.sh@14 -- # cut -f2 00:06:35.960 10:24:58 -- app/version.sh@14 -- # tr -d '"' 00:06:35.960 10:24:58 -- app/version.sh@18 -- # minor=5 00:06:35.960 10:24:58 -- app/version.sh@19 -- # get_header_version patch 00:06:35.960 10:24:58 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:35.960 10:24:58 -- app/version.sh@14 -- # cut -f2 00:06:35.960 10:24:58 -- app/version.sh@14 -- # tr -d '"' 00:06:36.221 10:24:58 -- app/version.sh@19 -- # patch=0 00:06:36.221 10:24:58 -- app/version.sh@20 -- # get_header_version suffix 00:06:36.221 10:24:58 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:36.221 10:24:58 -- app/version.sh@14 -- # cut -f2 00:06:36.221 10:24:58 -- app/version.sh@14 -- # tr -d '"' 00:06:36.221 10:24:58 -- app/version.sh@20 -- # suffix=-pre 00:06:36.221 10:24:58 -- app/version.sh@22 -- # version=24.5 00:06:36.221 10:24:58 -- app/version.sh@25 -- # (( patch != 0 )) 00:06:36.221 10:24:58 -- app/version.sh@28 -- # version=24.5rc0 00:06:36.221 10:24:58 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:06:36.221 10:24:58 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:06:36.221 10:24:58 -- app/version.sh@30 -- # py_version=24.5rc0 00:06:36.221 10:24:58 -- app/version.sh@31 -- # [[ 24.5rc0 == \2\4\.\5\r\c\0 ]] 00:06:36.221 00:06:36.221 real 0m0.184s 00:06:36.221 user 0m0.093s 00:06:36.221 sys 0m0.139s 00:06:36.221 10:24:58 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:36.221 10:24:58 -- common/autotest_common.sh@10 -- # set +x 00:06:36.221 ************************************ 00:06:36.221 END TEST version 00:06:36.221 ************************************ 00:06:36.221 10:24:58 -- spdk/autotest.sh@184 -- # '[' 0 -eq 1 ']' 00:06:36.221 10:24:58 -- spdk/autotest.sh@194 -- # uname -s 00:06:36.221 10:24:58 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:06:36.221 10:24:58 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:06:36.221 10:24:58 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:06:36.221 10:24:58 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 
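version.sh needs no running target at all: get_header_version scrapes the SPDK_VERSION_MAJOR/MINOR/PATCH/SUFFIX macros out of include/spdk/version.h, assembles 24.5rc0, and checks that the Python package agrees via python3 -c 'import spdk; print(spdk.__version__)'. Each field is extracted the same way, e.g. for the major number:

    grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' include/spdk/version.h \
        | cut -f2 | tr -d '"'

The long run of '[' 0 -eq 1 ']' checks here and below is autotest.sh skipping every optional suite this job's configuration did not enable; only the fuzzer branch further down evaluates to true.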
00:06:36.221 10:24:58 -- spdk/autotest.sh@254 -- # '[' 0 -eq 1 ']' 00:06:36.221 10:24:58 -- spdk/autotest.sh@258 -- # timing_exit lib 00:06:36.221 10:24:58 -- common/autotest_common.sh@716 -- # xtrace_disable 00:06:36.221 10:24:58 -- common/autotest_common.sh@10 -- # set +x 00:06:36.221 10:24:58 -- spdk/autotest.sh@260 -- # '[' 0 -eq 1 ']' 00:06:36.221 10:24:58 -- spdk/autotest.sh@268 -- # '[' 0 -eq 1 ']' 00:06:36.221 10:24:58 -- spdk/autotest.sh@277 -- # '[' 0 -eq 1 ']' 00:06:36.221 10:24:58 -- spdk/autotest.sh@306 -- # '[' 0 -eq 1 ']' 00:06:36.221 10:24:58 -- spdk/autotest.sh@310 -- # '[' 0 -eq 1 ']' 00:06:36.221 10:24:58 -- spdk/autotest.sh@314 -- # '[' 0 -eq 1 ']' 00:06:36.221 10:24:58 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:06:36.221 10:24:58 -- spdk/autotest.sh@328 -- # '[' 0 -eq 1 ']' 00:06:36.221 10:24:58 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:06:36.221 10:24:58 -- spdk/autotest.sh@337 -- # '[' 0 -eq 1 ']' 00:06:36.221 10:24:58 -- spdk/autotest.sh@341 -- # '[' 0 -eq 1 ']' 00:06:36.221 10:24:58 -- spdk/autotest.sh@345 -- # '[' 0 -eq 1 ']' 00:06:36.221 10:24:58 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:06:36.221 10:24:58 -- spdk/autotest.sh@354 -- # '[' 0 -eq 1 ']' 00:06:36.221 10:24:58 -- spdk/autotest.sh@361 -- # [[ 0 -eq 1 ]] 00:06:36.221 10:24:58 -- spdk/autotest.sh@365 -- # [[ 0 -eq 1 ]] 00:06:36.221 10:24:58 -- spdk/autotest.sh@369 -- # [[ 1 -eq 1 ]] 00:06:36.221 10:24:58 -- spdk/autotest.sh@370 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:06:36.221 10:24:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:36.221 10:24:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:36.221 10:24:58 -- common/autotest_common.sh@10 -- # set +x 00:06:36.221 ************************************ 00:06:36.221 START TEST llvm_fuzz 00:06:36.221 ************************************ 00:06:36.221 10:24:58 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:06:36.483 * Looking for test storage... 
00:06:36.483 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:06:36.483 10:24:58 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:06:36.483 10:24:58 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:06:36.483 10:24:58 -- common/autotest_common.sh@536 -- # fuzzers=() 00:06:36.483 10:24:58 -- common/autotest_common.sh@536 -- # local fuzzers 00:06:36.483 10:24:58 -- common/autotest_common.sh@538 -- # [[ -n '' ]] 00:06:36.483 10:24:58 -- common/autotest_common.sh@541 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:06:36.483 10:24:58 -- common/autotest_common.sh@542 -- # fuzzers=("${fuzzers[@]##*/}") 00:06:36.483 10:24:58 -- common/autotest_common.sh@545 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:06:36.483 10:24:58 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:06:36.483 10:24:58 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:06:36.483 10:24:58 -- fuzz/llvm.sh@56 -- # [[ 1 -eq 0 ]] 00:06:36.483 10:24:58 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:06:36.483 10:24:58 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:06:36.483 10:24:58 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:06:36.483 10:24:58 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:06:36.483 10:24:58 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:06:36.483 10:24:58 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:06:36.483 10:24:58 -- fuzz/llvm.sh@62 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:06:36.483 10:24:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:36.483 10:24:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:36.483 10:24:58 -- common/autotest_common.sh@10 -- # set +x 00:06:36.483 ************************************ 00:06:36.483 START TEST nvmf_fuzz 00:06:36.483 ************************************ 00:06:36.483 10:24:58 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:06:36.747 * Looking for test storage... 
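`get_fuzzer_targets`, traced above, discovers targets by globbing `test/fuzz/llvm/` and stripping the directory prefix; `llvm.sh` then skips the two helper scripts and launches each remaining entry's `run.sh`. A sketch of that discovery loop (`$rootdir` must point at an SPDK checkout):

```bash
#!/usr/bin/env bash
# Sketch of the target discovery traced above: glob the fuzzer directory,
# keep only the basenames, and skip the two helper scripts.
rootdir=${rootdir:-.}                          # point at an SPDK checkout

fuzzers=("$rootdir/test/fuzz/llvm/"*)          # common.sh llvm-gcov.sh nvmf vfio
fuzzers=("${fuzzers[@]##*/}")                  # strip the leading path

for fuzzer in "${fuzzers[@]}"; do
    case "$fuzzer" in
        common.sh | llvm-gcov.sh) ;;           # helpers, not fuzz targets
        *) echo "would run: $rootdir/test/fuzz/llvm/$fuzzer/run.sh" ;;
    esac
done
```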
00:06:36.747 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:36.747 10:24:58 -- nvmf/run.sh@60 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:06:36.747 10:24:58 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:06:36.747 10:24:58 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:06:36.747 10:24:58 -- common/autotest_common.sh@34 -- # set -e 00:06:36.747 10:24:58 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:06:36.747 10:24:58 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:06:36.747 10:24:58 -- common/autotest_common.sh@38 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:06:36.747 10:24:58 -- common/autotest_common.sh@43 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:06:36.747 10:24:58 -- common/autotest_common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:06:36.747 10:24:58 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:06:36.747 10:24:58 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:06:36.747 10:24:58 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:06:36.747 10:24:58 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:06:36.747 10:24:58 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:06:36.747 10:24:58 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:06:36.747 10:24:58 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:06:36.747 10:24:58 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:06:36.747 10:24:58 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:06:36.747 10:24:58 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:06:36.747 10:24:58 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:06:36.747 10:24:58 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:06:36.747 10:24:58 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:06:36.747 10:24:58 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:06:36.747 10:24:58 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:06:36.747 10:24:58 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:06:36.747 10:24:58 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:06:36.747 10:24:58 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:06:36.747 10:24:58 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:06:36.747 10:24:58 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:06:36.747 10:24:58 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:06:36.747 10:24:58 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:06:36.747 10:24:58 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:06:36.747 10:24:58 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:06:36.747 10:24:58 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:06:36.747 10:24:58 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:06:36.747 10:24:58 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:06:36.747 10:24:58 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:06:36.747 10:24:58 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:06:36.747 10:24:58 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:06:36.747 10:24:58 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:06:36.747 10:24:58 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 
00:06:36.747 10:24:58 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:06:36.747 10:24:58 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:06:36.747 10:24:58 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:06:36.747 10:24:58 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:06:36.747 10:24:58 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:06:36.747 10:24:58 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:06:36.747 10:24:58 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:06:36.747 10:24:58 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:06:36.747 10:24:58 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:06:36.747 10:24:58 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:06:36.747 10:24:58 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:06:36.747 10:24:58 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:06:36.747 10:24:58 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:06:36.747 10:24:58 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:06:36.747 10:24:58 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:06:36.747 10:24:58 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:06:36.747 10:24:58 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:06:36.747 10:24:58 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:06:36.747 10:24:58 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:06:36.747 10:24:58 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:06:36.747 10:24:58 -- common/build_config.sh@53 -- # CONFIG_HAVE_EVP_MAC=y 00:06:36.747 10:24:58 -- common/build_config.sh@54 -- # CONFIG_URING_ZNS=n 00:06:36.747 10:24:58 -- common/build_config.sh@55 -- # CONFIG_WERROR=y 00:06:36.748 10:24:58 -- common/build_config.sh@56 -- # CONFIG_HAVE_LIBBSD=n 00:06:36.748 10:24:58 -- common/build_config.sh@57 -- # CONFIG_UBSAN=y 00:06:36.748 10:24:58 -- common/build_config.sh@58 -- # CONFIG_IPSEC_MB_DIR= 00:06:36.748 10:24:58 -- common/build_config.sh@59 -- # CONFIG_GOLANG=n 00:06:36.748 10:24:58 -- common/build_config.sh@60 -- # CONFIG_ISAL=y 00:06:36.748 10:24:58 -- common/build_config.sh@61 -- # CONFIG_IDXD_KERNEL=n 00:06:36.748 10:24:58 -- common/build_config.sh@62 -- # CONFIG_DPDK_LIB_DIR= 00:06:36.748 10:24:58 -- common/build_config.sh@63 -- # CONFIG_RDMA_PROV=verbs 00:06:36.748 10:24:58 -- common/build_config.sh@64 -- # CONFIG_APPS=y 00:06:36.748 10:24:58 -- common/build_config.sh@65 -- # CONFIG_SHARED=n 00:06:36.748 10:24:58 -- common/build_config.sh@66 -- # CONFIG_HAVE_KEYUTILS=n 00:06:36.748 10:24:58 -- common/build_config.sh@67 -- # CONFIG_FC_PATH= 00:06:36.748 10:24:58 -- common/build_config.sh@68 -- # CONFIG_DPDK_PKG_CONFIG=n 00:06:36.748 10:24:58 -- common/build_config.sh@69 -- # CONFIG_FC=n 00:06:36.748 10:24:58 -- common/build_config.sh@70 -- # CONFIG_AVAHI=n 00:06:36.748 10:24:58 -- common/build_config.sh@71 -- # CONFIG_FIO_PLUGIN=y 00:06:36.748 10:24:58 -- common/build_config.sh@72 -- # CONFIG_RAID5F=n 00:06:36.748 10:24:58 -- common/build_config.sh@73 -- # CONFIG_EXAMPLES=y 00:06:36.748 10:24:58 -- common/build_config.sh@74 -- # CONFIG_TESTS=y 00:06:36.748 10:24:58 -- common/build_config.sh@75 -- # CONFIG_CRYPTO_MLX5=n 00:06:36.748 10:24:58 -- common/build_config.sh@76 -- # CONFIG_MAX_LCORES= 00:06:36.748 10:24:58 -- common/build_config.sh@77 -- # CONFIG_IPSEC_MB=n 00:06:36.748 10:24:58 -- common/build_config.sh@78 -- # CONFIG_PGO_DIR= 
00:06:36.748 10:24:58 -- common/build_config.sh@79 -- # CONFIG_DEBUG=y 00:06:36.748 10:24:58 -- common/build_config.sh@80 -- # CONFIG_DPDK_COMPRESSDEV=n 00:06:36.748 10:24:58 -- common/build_config.sh@81 -- # CONFIG_CROSS_PREFIX= 00:06:36.748 10:24:58 -- common/build_config.sh@82 -- # CONFIG_URING=n 00:06:36.748 10:24:58 -- common/autotest_common.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:06:36.748 10:24:58 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:06:36.748 10:24:58 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:06:36.748 10:24:58 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:06:36.748 10:24:58 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:06:36.748 10:24:58 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:06:36.748 10:24:58 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:06:36.748 10:24:58 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:06:36.748 10:24:58 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:06:36.748 10:24:58 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:06:36.748 10:24:58 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:06:36.748 10:24:58 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:06:36.748 10:24:58 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:06:36.748 10:24:58 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:06:36.748 10:24:58 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:06:36.748 10:24:58 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:06:36.748 #define SPDK_CONFIG_H 00:06:36.748 #define SPDK_CONFIG_APPS 1 00:06:36.748 #define SPDK_CONFIG_ARCH native 00:06:36.748 #undef SPDK_CONFIG_ASAN 00:06:36.748 #undef SPDK_CONFIG_AVAHI 00:06:36.748 #undef SPDK_CONFIG_CET 00:06:36.748 #define SPDK_CONFIG_COVERAGE 1 00:06:36.748 #define SPDK_CONFIG_CROSS_PREFIX 00:06:36.748 #undef SPDK_CONFIG_CRYPTO 00:06:36.748 #undef SPDK_CONFIG_CRYPTO_MLX5 00:06:36.748 #undef SPDK_CONFIG_CUSTOMOCF 00:06:36.748 #undef SPDK_CONFIG_DAOS 00:06:36.748 #define SPDK_CONFIG_DAOS_DIR 00:06:36.748 #define SPDK_CONFIG_DEBUG 1 00:06:36.748 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:06:36.748 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:06:36.748 #define SPDK_CONFIG_DPDK_INC_DIR 00:06:36.748 #define SPDK_CONFIG_DPDK_LIB_DIR 00:06:36.748 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:06:36.748 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:06:36.748 #define SPDK_CONFIG_EXAMPLES 1 00:06:36.748 #undef SPDK_CONFIG_FC 00:06:36.748 #define SPDK_CONFIG_FC_PATH 00:06:36.748 #define SPDK_CONFIG_FIO_PLUGIN 1 00:06:36.748 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:06:36.748 #undef SPDK_CONFIG_FUSE 00:06:36.748 #define SPDK_CONFIG_FUZZER 1 00:06:36.748 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:06:36.748 #undef SPDK_CONFIG_GOLANG 00:06:36.748 
#define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:06:36.748 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:06:36.748 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:06:36.748 #undef SPDK_CONFIG_HAVE_KEYUTILS 00:06:36.748 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:06:36.748 #undef SPDK_CONFIG_HAVE_LIBBSD 00:06:36.748 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:06:36.748 #define SPDK_CONFIG_IDXD 1 00:06:36.748 #undef SPDK_CONFIG_IDXD_KERNEL 00:06:36.748 #undef SPDK_CONFIG_IPSEC_MB 00:06:36.748 #define SPDK_CONFIG_IPSEC_MB_DIR 00:06:36.748 #define SPDK_CONFIG_ISAL 1 00:06:36.748 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:06:36.748 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:06:36.748 #define SPDK_CONFIG_LIBDIR 00:06:36.748 #undef SPDK_CONFIG_LTO 00:06:36.748 #define SPDK_CONFIG_MAX_LCORES 00:06:36.748 #define SPDK_CONFIG_NVME_CUSE 1 00:06:36.748 #undef SPDK_CONFIG_OCF 00:06:36.748 #define SPDK_CONFIG_OCF_PATH 00:06:36.748 #define SPDK_CONFIG_OPENSSL_PATH 00:06:36.748 #undef SPDK_CONFIG_PGO_CAPTURE 00:06:36.748 #define SPDK_CONFIG_PGO_DIR 00:06:36.748 #undef SPDK_CONFIG_PGO_USE 00:06:36.748 #define SPDK_CONFIG_PREFIX /usr/local 00:06:36.748 #undef SPDK_CONFIG_RAID5F 00:06:36.748 #undef SPDK_CONFIG_RBD 00:06:36.748 #define SPDK_CONFIG_RDMA 1 00:06:36.748 #define SPDK_CONFIG_RDMA_PROV verbs 00:06:36.748 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:06:36.748 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:06:36.748 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:06:36.748 #undef SPDK_CONFIG_SHARED 00:06:36.748 #undef SPDK_CONFIG_SMA 00:06:36.748 #define SPDK_CONFIG_TESTS 1 00:06:36.748 #undef SPDK_CONFIG_TSAN 00:06:36.748 #define SPDK_CONFIG_UBLK 1 00:06:36.748 #define SPDK_CONFIG_UBSAN 1 00:06:36.748 #undef SPDK_CONFIG_UNIT_TESTS 00:06:36.748 #undef SPDK_CONFIG_URING 00:06:36.748 #define SPDK_CONFIG_URING_PATH 00:06:36.748 #undef SPDK_CONFIG_URING_ZNS 00:06:36.748 #undef SPDK_CONFIG_USDT 00:06:36.748 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:06:36.748 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:06:36.748 #define SPDK_CONFIG_VFIO_USER 1 00:06:36.748 #define SPDK_CONFIG_VFIO_USER_DIR 00:06:36.748 #define SPDK_CONFIG_VHOST 1 00:06:36.748 #define SPDK_CONFIG_VIRTIO 1 00:06:36.748 #undef SPDK_CONFIG_VTUNE 00:06:36.748 #define SPDK_CONFIG_VTUNE_DIR 00:06:36.748 #define SPDK_CONFIG_WERROR 1 00:06:36.748 #define SPDK_CONFIG_WPDK_DIR 00:06:36.748 #undef SPDK_CONFIG_XNVME 00:06:36.748 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:06:36.748 10:24:58 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:06:36.748 10:24:58 -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:36.748 10:24:58 -- scripts/common.sh@502 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:36.748 10:24:58 -- scripts/common.sh@510 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:36.748 10:24:58 -- scripts/common.sh@511 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:36.748 10:24:58 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:36.748 10:24:58 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:36.748 10:24:58 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:36.748 10:24:58 -- paths/export.sh@5 -- # export PATH 00:06:36.748 10:24:58 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:36.748 10:24:58 -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:06:36.748 10:24:58 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:06:36.748 10:24:58 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:06:36.748 10:24:58 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:06:36.748 10:24:58 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:06:36.748 10:24:58 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:06:36.748 10:24:58 -- pm/common@67 -- # TEST_TAG=N/A 00:06:36.748 10:24:58 -- pm/common@68 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:06:36.748 10:24:58 -- pm/common@70 -- # PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:06:36.748 10:24:58 -- pm/common@71 -- # uname -s 00:06:36.748 10:24:58 -- pm/common@71 -- # PM_OS=Linux 00:06:36.748 10:24:58 -- pm/common@73 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:06:36.748 10:24:58 -- pm/common@74 -- # [[ Linux == FreeBSD ]] 00:06:36.748 10:24:58 -- pm/common@76 -- # [[ Linux == Linux ]] 00:06:36.748 10:24:58 -- pm/common@76 -- # [[ ............................... != QEMU ]] 00:06:36.748 10:24:58 -- pm/common@76 -- # [[ ! 
-e /.dockerenv ]] 00:06:36.748 10:24:58 -- pm/common@79 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:06:36.749 10:24:58 -- pm/common@80 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:06:36.749 10:24:58 -- pm/common@83 -- # MONITOR_RESOURCES_PIDS=() 00:06:36.749 10:24:58 -- pm/common@83 -- # declare -A MONITOR_RESOURCES_PIDS 00:06:36.749 10:24:58 -- pm/common@85 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:06:36.749 10:24:58 -- common/autotest_common.sh@57 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@58 -- # export RUN_NIGHTLY 00:06:36.749 10:24:58 -- common/autotest_common.sh@61 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@62 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:06:36.749 10:24:58 -- common/autotest_common.sh@63 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@64 -- # export SPDK_RUN_VALGRIND 00:06:36.749 10:24:58 -- common/autotest_common.sh@65 -- # : 1 00:06:36.749 10:24:58 -- common/autotest_common.sh@66 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:06:36.749 10:24:58 -- common/autotest_common.sh@67 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@68 -- # export SPDK_TEST_UNITTEST 00:06:36.749 10:24:58 -- common/autotest_common.sh@69 -- # : 00:06:36.749 10:24:58 -- common/autotest_common.sh@70 -- # export SPDK_TEST_AUTOBUILD 00:06:36.749 10:24:58 -- common/autotest_common.sh@71 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@72 -- # export SPDK_TEST_RELEASE_BUILD 00:06:36.749 10:24:58 -- common/autotest_common.sh@73 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@74 -- # export SPDK_TEST_ISAL 00:06:36.749 10:24:58 -- common/autotest_common.sh@75 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@76 -- # export SPDK_TEST_ISCSI 00:06:36.749 10:24:58 -- common/autotest_common.sh@77 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@78 -- # export SPDK_TEST_ISCSI_INITIATOR 00:06:36.749 10:24:58 -- common/autotest_common.sh@79 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@80 -- # export SPDK_TEST_NVME 00:06:36.749 10:24:58 -- common/autotest_common.sh@81 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@82 -- # export SPDK_TEST_NVME_PMR 00:06:36.749 10:24:58 -- common/autotest_common.sh@83 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@84 -- # export SPDK_TEST_NVME_BP 00:06:36.749 10:24:58 -- common/autotest_common.sh@85 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@86 -- # export SPDK_TEST_NVME_CLI 00:06:36.749 10:24:58 -- common/autotest_common.sh@87 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@88 -- # export SPDK_TEST_NVME_CUSE 00:06:36.749 10:24:58 -- common/autotest_common.sh@89 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@90 -- # export SPDK_TEST_NVME_FDP 00:06:36.749 10:24:58 -- common/autotest_common.sh@91 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@92 -- # export SPDK_TEST_NVMF 00:06:36.749 10:24:58 -- common/autotest_common.sh@93 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@94 -- # export SPDK_TEST_VFIOUSER 00:06:36.749 10:24:58 -- common/autotest_common.sh@95 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@96 -- # export SPDK_TEST_VFIOUSER_QEMU 00:06:36.749 10:24:58 -- common/autotest_common.sh@97 -- # : 1 00:06:36.749 10:24:58 -- common/autotest_common.sh@98 -- # export SPDK_TEST_FUZZER 00:06:36.749 10:24:58 -- common/autotest_common.sh@99 -- # : 1 00:06:36.749 10:24:58 -- common/autotest_common.sh@100 -- # export 
SPDK_TEST_FUZZER_SHORT 00:06:36.749 10:24:58 -- common/autotest_common.sh@101 -- # : rdma 00:06:36.749 10:24:58 -- common/autotest_common.sh@102 -- # export SPDK_TEST_NVMF_TRANSPORT 00:06:36.749 10:24:58 -- common/autotest_common.sh@103 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@104 -- # export SPDK_TEST_RBD 00:06:36.749 10:24:58 -- common/autotest_common.sh@105 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@106 -- # export SPDK_TEST_VHOST 00:06:36.749 10:24:58 -- common/autotest_common.sh@107 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@108 -- # export SPDK_TEST_BLOCKDEV 00:06:36.749 10:24:58 -- common/autotest_common.sh@109 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@110 -- # export SPDK_TEST_IOAT 00:06:36.749 10:24:58 -- common/autotest_common.sh@111 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@112 -- # export SPDK_TEST_BLOBFS 00:06:36.749 10:24:58 -- common/autotest_common.sh@113 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@114 -- # export SPDK_TEST_VHOST_INIT 00:06:36.749 10:24:58 -- common/autotest_common.sh@115 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@116 -- # export SPDK_TEST_LVOL 00:06:36.749 10:24:58 -- common/autotest_common.sh@117 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@118 -- # export SPDK_TEST_VBDEV_COMPRESS 00:06:36.749 10:24:58 -- common/autotest_common.sh@119 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@120 -- # export SPDK_RUN_ASAN 00:06:36.749 10:24:58 -- common/autotest_common.sh@121 -- # : 1 00:06:36.749 10:24:58 -- common/autotest_common.sh@122 -- # export SPDK_RUN_UBSAN 00:06:36.749 10:24:58 -- common/autotest_common.sh@123 -- # : 00:06:36.749 10:24:58 -- common/autotest_common.sh@124 -- # export SPDK_RUN_EXTERNAL_DPDK 00:06:36.749 10:24:58 -- common/autotest_common.sh@125 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@126 -- # export SPDK_RUN_NON_ROOT 00:06:36.749 10:24:58 -- common/autotest_common.sh@127 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@128 -- # export SPDK_TEST_CRYPTO 00:06:36.749 10:24:58 -- common/autotest_common.sh@129 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@130 -- # export SPDK_TEST_FTL 00:06:36.749 10:24:58 -- common/autotest_common.sh@131 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@132 -- # export SPDK_TEST_OCF 00:06:36.749 10:24:58 -- common/autotest_common.sh@133 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@134 -- # export SPDK_TEST_VMD 00:06:36.749 10:24:58 -- common/autotest_common.sh@135 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@136 -- # export SPDK_TEST_OPAL 00:06:36.749 10:24:58 -- common/autotest_common.sh@137 -- # : 00:06:36.749 10:24:58 -- common/autotest_common.sh@138 -- # export SPDK_TEST_NATIVE_DPDK 00:06:36.749 10:24:58 -- common/autotest_common.sh@139 -- # : true 00:06:36.749 10:24:58 -- common/autotest_common.sh@140 -- # export SPDK_AUTOTEST_X 00:06:36.749 10:24:58 -- common/autotest_common.sh@141 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@142 -- # export SPDK_TEST_RAID5 00:06:36.749 10:24:58 -- common/autotest_common.sh@143 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@144 -- # export SPDK_TEST_URING 00:06:36.749 10:24:58 -- common/autotest_common.sh@145 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@146 -- # export SPDK_TEST_USDT 00:06:36.749 10:24:58 -- common/autotest_common.sh@147 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@148 
-- # export SPDK_TEST_USE_IGB_UIO 00:06:36.749 10:24:58 -- common/autotest_common.sh@149 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@150 -- # export SPDK_TEST_SCHEDULER 00:06:36.749 10:24:58 -- common/autotest_common.sh@151 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@152 -- # export SPDK_TEST_SCANBUILD 00:06:36.749 10:24:58 -- common/autotest_common.sh@153 -- # : 00:06:36.749 10:24:58 -- common/autotest_common.sh@154 -- # export SPDK_TEST_NVMF_NICS 00:06:36.749 10:24:58 -- common/autotest_common.sh@155 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@156 -- # export SPDK_TEST_SMA 00:06:36.749 10:24:58 -- common/autotest_common.sh@157 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@158 -- # export SPDK_TEST_DAOS 00:06:36.749 10:24:58 -- common/autotest_common.sh@159 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@160 -- # export SPDK_TEST_XNVME 00:06:36.749 10:24:58 -- common/autotest_common.sh@161 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@162 -- # export SPDK_TEST_ACCEL_DSA 00:06:36.749 10:24:58 -- common/autotest_common.sh@163 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@164 -- # export SPDK_TEST_ACCEL_IAA 00:06:36.749 10:24:58 -- common/autotest_common.sh@166 -- # : 00:06:36.749 10:24:58 -- common/autotest_common.sh@167 -- # export SPDK_TEST_FUZZER_TARGET 00:06:36.749 10:24:58 -- common/autotest_common.sh@168 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@169 -- # export SPDK_TEST_NVMF_MDNS 00:06:36.749 10:24:58 -- common/autotest_common.sh@170 -- # : 0 00:06:36.749 10:24:58 -- common/autotest_common.sh@171 -- # export SPDK_JSONRPC_GO_CLIENT 00:06:36.749 10:24:58 -- common/autotest_common.sh@174 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:06:36.749 10:24:58 -- common/autotest_common.sh@174 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:06:36.749 10:24:58 -- common/autotest_common.sh@175 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:06:36.749 10:24:58 -- common/autotest_common.sh@175 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:06:36.749 10:24:58 -- common/autotest_common.sh@176 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:36.749 10:24:58 -- common/autotest_common.sh@176 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:36.749 10:24:58 -- common/autotest_common.sh@177 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:36.749 10:24:58 -- 
common/autotest_common.sh@177 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:36.749 10:24:58 -- common/autotest_common.sh@180 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:06:36.749 10:24:58 -- common/autotest_common.sh@180 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:06:36.749 10:24:58 -- common/autotest_common.sh@184 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:06:36.750 10:24:58 -- common/autotest_common.sh@184 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:06:36.750 10:24:58 -- common/autotest_common.sh@188 -- # export PYTHONDONTWRITEBYTECODE=1 00:06:36.750 10:24:58 -- common/autotest_common.sh@188 -- # PYTHONDONTWRITEBYTECODE=1 00:06:36.750 10:24:58 -- common/autotest_common.sh@192 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:06:36.750 10:24:58 -- common/autotest_common.sh@192 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:06:36.750 10:24:58 -- common/autotest_common.sh@193 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:06:36.750 10:24:58 -- common/autotest_common.sh@193 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:06:36.750 10:24:58 -- common/autotest_common.sh@197 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:06:36.750 10:24:58 -- common/autotest_common.sh@198 -- # rm -rf /var/tmp/asan_suppression_file 00:06:36.750 10:24:58 -- common/autotest_common.sh@199 -- # cat 00:06:36.750 10:24:58 -- common/autotest_common.sh@225 -- # echo leak:libfuse3.so 00:06:36.750 10:24:58 -- common/autotest_common.sh@227 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:06:36.750 10:24:58 -- common/autotest_common.sh@227 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 
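The environment setup traced in this block ends by wiring up the sanitizers: known leaks go into a suppression file and the ASAN/UBSAN/LSAN option strings are exported before any target runs (the nvmf `run.sh` below appends `leak:spdk_nvmf_qpair_disconnect` and `leak:nvmf_ctrlr_create` the same way). A sketch with the option strings copied from this trace:

```bash
#!/usr/bin/env bash
# Sketch of the sanitizer wiring traced above: collect known leaks in a
# suppression file and export the option strings (copied from this run)
# before any fuzz target starts.
supp=/var/tmp/asan_suppression_file
rm -rf "$supp"
echo "leak:libfuse3.so" >> "$supp"   # per-target run.sh scripts append their own entries

export ASAN_OPTIONS="new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0"
export UBSAN_OPTIONS="halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134"
export LSAN_OPTIONS="suppressions=$supp"
```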
00:06:36.750 10:24:58 -- common/autotest_common.sh@229 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:06:36.750 10:24:58 -- common/autotest_common.sh@229 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:06:36.750 10:24:58 -- common/autotest_common.sh@231 -- # '[' -z /var/spdk/dependencies ']' 00:06:36.750 10:24:58 -- common/autotest_common.sh@234 -- # export DEPENDENCY_DIR 00:06:36.750 10:24:58 -- common/autotest_common.sh@238 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:06:36.750 10:24:58 -- common/autotest_common.sh@238 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:06:36.750 10:24:58 -- common/autotest_common.sh@239 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:06:36.750 10:24:58 -- common/autotest_common.sh@239 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:06:36.750 10:24:58 -- common/autotest_common.sh@242 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:06:36.750 10:24:58 -- common/autotest_common.sh@242 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:06:36.750 10:24:58 -- common/autotest_common.sh@243 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:06:36.750 10:24:58 -- common/autotest_common.sh@243 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:06:36.750 10:24:58 -- common/autotest_common.sh@245 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:06:36.750 10:24:58 -- common/autotest_common.sh@245 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:06:36.750 10:24:58 -- common/autotest_common.sh@248 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:06:36.750 10:24:58 -- common/autotest_common.sh@248 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:06:36.750 10:24:58 -- common/autotest_common.sh@251 -- # '[' 0 -eq 0 ']' 00:06:36.750 10:24:58 -- common/autotest_common.sh@252 -- # export valgrind= 00:06:36.750 10:24:58 -- common/autotest_common.sh@252 -- # valgrind= 00:06:36.750 10:24:58 -- common/autotest_common.sh@258 -- # uname -s 00:06:36.750 10:24:58 -- common/autotest_common.sh@258 -- # '[' Linux = Linux ']' 00:06:36.750 10:24:58 -- common/autotest_common.sh@259 -- # HUGEMEM=4096 00:06:36.750 10:24:58 -- common/autotest_common.sh@260 -- # export CLEAR_HUGE=yes 00:06:36.750 10:24:58 -- common/autotest_common.sh@260 -- # CLEAR_HUGE=yes 00:06:36.750 10:24:58 -- common/autotest_common.sh@261 -- # [[ 0 -eq 1 ]] 00:06:36.750 10:24:58 -- common/autotest_common.sh@261 -- # [[ 0 -eq 1 ]] 00:06:36.750 10:24:58 -- common/autotest_common.sh@268 -- # MAKE=make 00:06:36.750 10:24:58 -- common/autotest_common.sh@269 -- # MAKEFLAGS=-j72 00:06:36.750 10:24:58 -- common/autotest_common.sh@285 -- # export HUGEMEM=4096 00:06:36.750 10:24:58 -- common/autotest_common.sh@285 -- # HUGEMEM=4096 00:06:36.750 10:24:58 -- common/autotest_common.sh@287 -- # NO_HUGE=() 00:06:36.750 10:24:58 -- common/autotest_common.sh@288 -- # TEST_MODE= 00:06:36.750 10:24:58 -- common/autotest_common.sh@307 -- # [[ -z 200202 ]] 00:06:36.750 10:24:58 -- common/autotest_common.sh@307 -- # kill -0 200202 00:06:36.750 10:24:58 -- common/autotest_common.sh@1666 -- # set_test_storage 2147483648 00:06:36.750 10:24:58 -- common/autotest_common.sh@317 -- # [[ -v testdir ]] 00:06:36.750 10:24:58 -- common/autotest_common.sh@319 -- # local requested_size=2147483648 
00:06:36.750 10:24:58 -- common/autotest_common.sh@320 -- # local mount target_dir 00:06:36.750 10:24:58 -- common/autotest_common.sh@322 -- # local -A mounts fss sizes avails uses 00:06:36.750 10:24:58 -- common/autotest_common.sh@323 -- # local source fs size avail mount use 00:06:36.750 10:24:58 -- common/autotest_common.sh@325 -- # local storage_fallback storage_candidates 00:06:36.750 10:24:58 -- common/autotest_common.sh@327 -- # mktemp -udt spdk.XXXXXX 00:06:36.750 10:24:58 -- common/autotest_common.sh@327 -- # storage_fallback=/tmp/spdk.PEtPkI 00:06:36.750 10:24:58 -- common/autotest_common.sh@332 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:06:36.750 10:24:58 -- common/autotest_common.sh@334 -- # [[ -n '' ]] 00:06:36.750 10:24:58 -- common/autotest_common.sh@339 -- # [[ -n '' ]] 00:06:36.750 10:24:58 -- common/autotest_common.sh@344 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.PEtPkI/tests/nvmf /tmp/spdk.PEtPkI 00:06:36.750 10:24:58 -- common/autotest_common.sh@347 -- # requested_size=2214592512 00:06:36.750 10:24:58 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:06:36.750 10:24:58 -- common/autotest_common.sh@316 -- # df -T 00:06:36.750 10:24:58 -- common/autotest_common.sh@316 -- # grep -v Filesystem 00:06:36.750 10:24:58 -- common/autotest_common.sh@350 -- # mounts["$mount"]=spdk_devtmpfs 00:06:36.750 10:24:58 -- common/autotest_common.sh@350 -- # fss["$mount"]=devtmpfs 00:06:36.750 10:24:58 -- common/autotest_common.sh@351 -- # avails["$mount"]=67108864 00:06:36.750 10:24:58 -- common/autotest_common.sh@351 -- # sizes["$mount"]=67108864 00:06:36.750 10:24:58 -- common/autotest_common.sh@352 -- # uses["$mount"]=0 00:06:36.750 10:24:58 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:06:36.750 10:24:58 -- common/autotest_common.sh@350 -- # mounts["$mount"]=/dev/pmem0 00:06:36.750 10:24:58 -- common/autotest_common.sh@350 -- # fss["$mount"]=ext2 00:06:36.750 10:24:58 -- common/autotest_common.sh@351 -- # avails["$mount"]=995880960 00:06:36.750 10:24:58 -- common/autotest_common.sh@351 -- # sizes["$mount"]=5284429824 00:06:36.750 10:24:58 -- common/autotest_common.sh@352 -- # uses["$mount"]=4288548864 00:06:36.750 10:24:58 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:06:36.750 10:24:58 -- common/autotest_common.sh@350 -- # mounts["$mount"]=spdk_root 00:06:36.750 10:24:58 -- common/autotest_common.sh@350 -- # fss["$mount"]=overlay 00:06:36.750 10:24:58 -- common/autotest_common.sh@351 -- # avails["$mount"]=54710579200 00:06:36.750 10:24:58 -- common/autotest_common.sh@351 -- # sizes["$mount"]=61742718976 00:06:36.750 10:24:58 -- common/autotest_common.sh@352 -- # uses["$mount"]=7032139776 00:06:36.750 10:24:58 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:06:36.750 10:24:58 -- common/autotest_common.sh@350 -- # mounts["$mount"]=/dev/sda1 00:06:36.750 10:24:58 -- common/autotest_common.sh@350 -- # fss["$mount"]=xfs 00:06:36.750 10:24:58 -- common/autotest_common.sh@351 -- # avails["$mount"]=221821267968 00:06:36.750 10:24:58 -- common/autotest_common.sh@351 -- # sizes["$mount"]=239938535424 00:06:36.750 10:24:58 -- common/autotest_common.sh@352 -- # uses["$mount"]=18117267456 00:06:36.750 10:24:58 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:06:36.750 10:24:58 -- common/autotest_common.sh@350 -- # 
mounts["$mount"]=tmpfs 00:06:36.750 10:24:58 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:06:36.750 10:24:58 -- common/autotest_common.sh@351 -- # avails["$mount"]=30870081536 00:06:36.750 10:24:58 -- common/autotest_common.sh@351 -- # sizes["$mount"]=30871359488 00:06:36.750 10:24:58 -- common/autotest_common.sh@352 -- # uses["$mount"]=1277952 00:06:36.750 10:24:58 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:06:36.750 10:24:58 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:06:36.750 10:24:58 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:06:36.750 10:24:58 -- common/autotest_common.sh@351 -- # avails["$mount"]=12342808576 00:06:36.750 10:24:58 -- common/autotest_common.sh@351 -- # sizes["$mount"]=12348547072 00:06:36.750 10:24:58 -- common/autotest_common.sh@352 -- # uses["$mount"]=5738496 00:06:36.750 10:24:58 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:06:36.750 10:24:58 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:06:36.750 10:24:58 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:06:36.750 10:24:58 -- common/autotest_common.sh@351 -- # avails["$mount"]=30870974464 00:06:36.750 10:24:58 -- common/autotest_common.sh@351 -- # sizes["$mount"]=30871359488 00:06:36.750 10:24:58 -- common/autotest_common.sh@352 -- # uses["$mount"]=385024 00:06:36.750 10:24:58 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:06:36.750 10:24:58 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:06:36.750 10:24:58 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:06:36.750 10:24:58 -- common/autotest_common.sh@351 -- # avails["$mount"]=6174265344 00:06:36.750 10:24:58 -- common/autotest_common.sh@351 -- # sizes["$mount"]=6174269440 00:06:36.750 10:24:58 -- common/autotest_common.sh@352 -- # uses["$mount"]=4096 00:06:36.750 10:24:58 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:06:36.750 10:24:58 -- common/autotest_common.sh@355 -- # printf '* Looking for test storage...\n' 00:06:36.750 * Looking for test storage... 
00:06:36.750 10:24:58 -- common/autotest_common.sh@357 -- # local target_space new_size 00:06:36.750 10:24:58 -- common/autotest_common.sh@358 -- # for target_dir in "${storage_candidates[@]}" 00:06:36.750 10:24:58 -- common/autotest_common.sh@361 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:36.750 10:24:58 -- common/autotest_common.sh@361 -- # awk '$1 !~ /Filesystem/{print $6}' 00:06:36.750 10:24:58 -- common/autotest_common.sh@361 -- # mount=/ 00:06:36.750 10:24:58 -- common/autotest_common.sh@363 -- # target_space=54710579200 00:06:36.750 10:24:58 -- common/autotest_common.sh@364 -- # (( target_space == 0 || target_space < requested_size )) 00:06:36.750 10:24:58 -- common/autotest_common.sh@367 -- # (( target_space >= requested_size )) 00:06:36.750 10:24:58 -- common/autotest_common.sh@369 -- # [[ overlay == tmpfs ]] 00:06:36.750 10:24:58 -- common/autotest_common.sh@369 -- # [[ overlay == ramfs ]] 00:06:36.750 10:24:58 -- common/autotest_common.sh@369 -- # [[ / == / ]] 00:06:36.750 10:24:58 -- common/autotest_common.sh@370 -- # new_size=9246732288 00:06:36.750 10:24:58 -- common/autotest_common.sh@371 -- # (( new_size * 100 / sizes[/] > 95 )) 00:06:36.750 10:24:58 -- common/autotest_common.sh@376 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:36.750 10:24:58 -- common/autotest_common.sh@376 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:36.751 10:24:58 -- common/autotest_common.sh@377 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:36.751 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:36.751 10:24:58 -- common/autotest_common.sh@378 -- # return 0 00:06:36.751 10:24:58 -- common/autotest_common.sh@1668 -- # set -o errtrace 00:06:36.751 10:24:58 -- common/autotest_common.sh@1669 -- # shopt -s extdebug 00:06:36.751 10:24:58 -- common/autotest_common.sh@1670 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:06:36.751 10:24:58 -- common/autotest_common.sh@1672 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:06:36.751 10:24:58 -- common/autotest_common.sh@1673 -- # true 00:06:36.751 10:24:58 -- common/autotest_common.sh@1675 -- # xtrace_fd 00:06:36.751 10:24:58 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:06:36.751 10:24:58 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:06:36.751 10:24:58 -- common/autotest_common.sh@27 -- # exec 00:06:36.751 10:24:58 -- common/autotest_common.sh@29 -- # exec 00:06:36.751 10:24:58 -- common/autotest_common.sh@31 -- # xtrace_restore 00:06:36.751 10:24:58 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:06:36.751 10:24:58 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:06:36.751 10:24:58 -- common/autotest_common.sh@18 -- # set -x 00:06:36.751 10:24:58 -- nvmf/run.sh@61 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:06:36.751 10:24:58 -- ../common.sh@8 -- # pids=() 00:06:36.751 10:24:58 -- nvmf/run.sh@63 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:06:36.751 10:24:58 -- nvmf/run.sh@64 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:06:36.751 10:24:58 -- nvmf/run.sh@64 -- # fuzz_num=25 00:06:36.751 10:24:58 -- nvmf/run.sh@65 -- # (( fuzz_num != 0 )) 00:06:36.751 10:24:58 -- nvmf/run.sh@67 -- # trap 'cleanup /tmp/llvm_fuzz* /var/tmp/suppress_nvmf_fuzz; exit 1' SIGINT SIGTERM EXIT 00:06:36.751 10:24:58 -- nvmf/run.sh@69 -- # mem_size=512 00:06:36.751 10:24:58 -- nvmf/run.sh@70 -- # [[ 1 -eq 1 ]] 00:06:36.751 10:24:58 -- nvmf/run.sh@71 -- # start_llvm_fuzz_short 25 1 00:06:36.751 10:24:58 -- ../common.sh@69 -- # local fuzz_num=25 00:06:36.751 10:24:58 -- ../common.sh@70 -- # local time=1 00:06:36.751 10:24:58 -- ../common.sh@72 -- # (( i = 0 )) 00:06:36.751 10:24:58 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:36.751 10:24:58 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:06:36.751 10:24:58 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:06:36.751 10:24:58 -- nvmf/run.sh@24 -- # local timen=1 00:06:36.751 10:24:58 -- nvmf/run.sh@25 -- # local core=0x1 00:06:36.751 10:24:58 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:06:36.751 10:24:58 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:06:36.751 10:24:58 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:06:36.751 10:24:58 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:06:36.751 10:24:58 -- nvmf/run.sh@34 -- # printf %02d 0 00:06:36.751 10:24:58 -- nvmf/run.sh@34 -- # port=4400 00:06:36.751 10:24:58 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:06:36.751 10:24:58 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:06:36.751 10:24:58 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:36.751 10:24:58 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:06:36.751 10:24:58 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:06:36.751 10:24:58 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 00:06:36.751 [2024-04-19 10:24:58.818183] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
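The launch just traced derives everything per target: the count of `.fn =` entries in `llvm_nvme_fuzz.c` fixes the number of fuzzer types (25), target index 0 becomes TCP port 4400, and the stock `fuzz_json.conf` is rewritten to that port before the fuzzer starts. A sketch of that setup; the `44` port prefix is inferred from `port=4400` at index 0, and paths are relative to an SPDK checkout:

```bash
#!/usr/bin/env bash
# Sketch of the per-target setup traced above. The "44" port prefix is
# inferred from port=4400 for fuzzer_type=0.
fuzzfile=test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c

fuzz_num=$(grep -c '\.fn =' "$fuzzfile")       # 25 in this run
echo "fuzzer types: $fuzz_num"

fuzzer_type=0
port=44$(printf %02d "$fuzzer_type")           # -> 4400
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

# Rewrite the stock JSON config to listen on this target's port.
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    test/fuzz/llvm/nvmf/fuzz_json.conf > "/tmp/fuzz_json_${fuzzer_type}.conf"

echo "target $fuzzer_type: $trid"
```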
00:06:36.751 [2024-04-19 10:24:58.818254] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid200246 ] 00:06:36.751 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.012 [2024-04-19 10:24:59.002704] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.012 [2024-04-19 10:24:59.071473] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.271 [2024-04-19 10:24:59.130843] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:37.271 [2024-04-19 10:24:59.146978] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:06:37.271 INFO: Running with entropic power schedule (0xFF, 100). 00:06:37.271 INFO: Seed: 1927552938 00:06:37.271 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:06:37.271 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:06:37.271 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:06:37.271 INFO: A corpus is not provided, starting from an empty corpus 00:06:37.272 #2 INITED exec/s: 0 rss: 63Mb 00:06:37.272 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:06:37.272 This may also happen if the target rejected all inputs we tried so far 00:06:37.272 [2024-04-19 10:24:59.191751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:37.272 [2024-04-19 10:24:59.191787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.532 NEW_FUNC[1/669]: 0x481d00 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:06:37.532 NEW_FUNC[2/669]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:37.532 #9 NEW cov: 11608 ft: 11609 corp: 2/117b lim: 320 exec/s: 0 rss: 69Mb L: 116/116 MS: 2 ChangeBit-InsertRepeatedBytes- 00:06:37.532 [2024-04-19 10:24:59.552661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:37.532 [2024-04-19 10:24:59.552707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.532 #10 NEW cov: 11738 ft: 12222 corp: 3/234b lim: 320 exec/s: 0 rss: 69Mb L: 117/117 MS: 1 CrossOver- 00:06:37.532 [2024-04-19 10:24:59.622708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:37.532 [2024-04-19 10:24:59.622744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.793 #11 NEW cov: 11744 ft: 12519 corp: 4/351b lim: 320 exec/s: 0 rss: 69Mb L: 117/117 MS: 1 ChangeBinInt- 00:06:37.793 [2024-04-19 10:24:59.692911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:37.793 [2024-04-19 10:24:59.692946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.793 #12 NEW cov: 11829 ft: 12781 
corp: 5/468b lim: 320 exec/s: 0 rss: 69Mb L: 117/117 MS: 1 ShuffleBytes- 00:06:37.793 [2024-04-19 10:24:59.763079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:37.793 [2024-04-19 10:24:59.763113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.793 #13 NEW cov: 11829 ft: 12871 corp: 6/584b lim: 320 exec/s: 0 rss: 69Mb L: 116/117 MS: 1 ChangeBinInt- 00:06:37.793 [2024-04-19 10:24:59.813316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:37.793 [2024-04-19 10:24:59.813349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.793 [2024-04-19 10:24:59.813381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:37.793 [2024-04-19 10:24:59.813396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:37.793 [2024-04-19 10:24:59.813425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:37.793 [2024-04-19 10:24:59.813441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:37.793 #14 NEW cov: 11829 ft: 13196 corp: 7/778b lim: 320 exec/s: 0 rss: 69Mb L: 194/194 MS: 1 CrossOver- 00:06:37.793 [2024-04-19 10:24:59.863300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:37.793 [2024-04-19 10:24:59.863332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.053 #20 NEW cov: 11829 ft: 13285 corp: 8/895b lim: 320 exec/s: 0 rss: 70Mb L: 117/194 MS: 1 InsertByte- 00:06:38.054 [2024-04-19 10:24:59.933530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:75000000 cdw10:00000000 cdw11:00000000 00:06:38.054 [2024-04-19 10:24:59.933563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.054 #21 NEW cov: 11829 ft: 13322 corp: 9/1012b lim: 320 exec/s: 0 rss: 70Mb L: 117/194 MS: 1 ShuffleBytes- 00:06:38.054 [2024-04-19 10:24:59.983571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.054 [2024-04-19 10:24:59.983602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.054 #22 NEW cov: 11829 ft: 13353 corp: 10/1129b lim: 320 exec/s: 0 rss: 70Mb L: 117/194 MS: 1 ChangeBinInt- 00:06:38.054 [2024-04-19 10:25:00.053868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.054 [2024-04-19 10:25:00.053910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.054 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:38.054 #23 NEW cov: 11852 ft: 13415 corp: 11/1247b lim: 320 exec/s: 0 rss: 70Mb L: 118/194 MS: 1 
CopyPart- 00:06:38.054 [2024-04-19 10:25:00.114763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.054 [2024-04-19 10:25:00.114822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.054 [2024-04-19 10:25:00.114894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.054 [2024-04-19 10:25:00.114920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.054 #24 NEW cov: 11852 ft: 13762 corp: 12/1426b lim: 320 exec/s: 0 rss: 70Mb L: 179/194 MS: 1 CrossOver- 00:06:38.054 [2024-04-19 10:25:00.155068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.054 [2024-04-19 10:25:00.155143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.054 [2024-04-19 10:25:00.155248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.054 [2024-04-19 10:25:00.155287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.314 #25 NEW cov: 11852 ft: 13920 corp: 13/1606b lim: 320 exec/s: 25 rss: 70Mb L: 180/194 MS: 1 InsertByte- 00:06:38.314 [2024-04-19 10:25:00.214922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.314 [2024-04-19 10:25:00.214949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.314 [2024-04-19 10:25:00.214999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.314 [2024-04-19 10:25:00.215012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.314 #26 NEW cov: 11852 ft: 14049 corp: 14/1754b lim: 320 exec/s: 26 rss: 70Mb L: 148/194 MS: 1 CopyPart- 00:06:38.314 [2024-04-19 10:25:00.255045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:1c1c0000 00:06:38.314 [2024-04-19 10:25:00.255070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.314 [2024-04-19 10:25:00.255123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: VIRTUALIZATION MANAGEMENT (1c) qid:0 cid:5 nsid:1c1c1c1c cdw10:00000000 cdw11:00000000 00:06:38.314 [2024-04-19 10:25:00.255137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.314 #27 NEW cov: 11853 ft: 14156 corp: 15/1911b lim: 320 exec/s: 27 rss: 70Mb L: 157/194 MS: 1 InsertRepeatedBytes- 00:06:38.314 [2024-04-19 10:25:00.305164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.314 [2024-04-19 10:25:00.305189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.314 [2024-04-19 10:25:00.305240] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.314 [2024-04-19 10:25:00.305253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.314 #28 NEW cov: 11853 ft: 14183 corp: 16/2059b lim: 320 exec/s: 28 rss: 70Mb L: 148/194 MS: 1 ShuffleBytes- 00:06:38.314 [2024-04-19 10:25:00.345232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.314 [2024-04-19 10:25:00.345257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.314 [2024-04-19 10:25:00.345308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.314 [2024-04-19 10:25:00.345321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.314 #29 NEW cov: 11853 ft: 14223 corp: 17/2220b lim: 320 exec/s: 29 rss: 70Mb L: 161/194 MS: 1 CrossOver- 00:06:38.314 [2024-04-19 10:25:00.385287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.314 [2024-04-19 10:25:00.385312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.314 #30 NEW cov: 11853 ft: 14278 corp: 18/2336b lim: 320 exec/s: 30 rss: 70Mb L: 116/194 MS: 1 ChangeBit- 00:06:38.575 [2024-04-19 10:25:00.425364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.575 [2024-04-19 10:25:00.425389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.575 #31 NEW cov: 11853 ft: 14325 corp: 19/2453b lim: 320 exec/s: 31 rss: 70Mb L: 117/194 MS: 1 ChangeBit- 00:06:38.575 [2024-04-19 10:25:00.465494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.575 [2024-04-19 10:25:00.465518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.575 #32 NEW cov: 11853 ft: 14365 corp: 20/2569b lim: 320 exec/s: 32 rss: 70Mb L: 116/194 MS: 1 ChangeBit- 00:06:38.575 [2024-04-19 10:25:00.505790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:27272727 cdw11:27272727 00:06:38.575 [2024-04-19 10:25:00.505819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.575 [2024-04-19 10:25:00.505878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (27) qid:0 cid:5 nsid:27272727 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.575 [2024-04-19 10:25:00.505892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.575 [2024-04-19 10:25:00.505943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.575 [2024-04-19 10:25:00.505956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 
p:0 m:0 dnr:0 00:06:38.575 NEW_FUNC[1/2]: 0x13042e0 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2027 00:06:38.575 NEW_FUNC[2/2]: 0x1708450 in nvme_get_sgl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:159 00:06:38.575 #33 NEW cov: 11907 ft: 14491 corp: 21/2776b lim: 320 exec/s: 33 rss: 70Mb L: 207/207 MS: 1 InsertRepeatedBytes- 00:06:38.575 [2024-04-19 10:25:00.555827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:1c1c0000 00:06:38.575 [2024-04-19 10:25:00.555852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.575 [2024-04-19 10:25:00.555906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: VIRTUALIZATION MANAGEMENT (1c) qid:0 cid:5 nsid:1c1c1c1c cdw10:00000000 cdw11:00000000 00:06:38.575 [2024-04-19 10:25:00.555920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.575 #34 NEW cov: 11907 ft: 14509 corp: 22/2933b lim: 320 exec/s: 34 rss: 70Mb L: 157/207 MS: 1 ChangeByte- 00:06:38.575 [2024-04-19 10:25:00.595923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.575 [2024-04-19 10:25:00.595951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.575 [2024-04-19 10:25:00.596004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.575 [2024-04-19 10:25:00.596017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.575 #35 NEW cov: 11907 ft: 14541 corp: 23/3095b lim: 320 exec/s: 35 rss: 70Mb L: 162/207 MS: 1 InsertByte- 00:06:38.575 [2024-04-19 10:25:00.645993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.575 [2024-04-19 10:25:00.646017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.575 #36 NEW cov: 11907 ft: 14552 corp: 24/3213b lim: 320 exec/s: 36 rss: 70Mb L: 118/207 MS: 1 InsertByte- 00:06:38.835 [2024-04-19 10:25:00.686060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.835 [2024-04-19 10:25:00.686085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.835 #37 NEW cov: 11907 ft: 14561 corp: 25/3329b lim: 320 exec/s: 37 rss: 70Mb L: 116/207 MS: 1 ShuffleBytes- 00:06:38.835 [2024-04-19 10:25:00.726218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:1c1c0000 00:06:38.835 [2024-04-19 10:25:00.726243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.835 #38 NEW cov: 11907 ft: 14574 corp: 26/3443b lim: 320 exec/s: 38 rss: 70Mb L: 114/207 MS: 1 EraseBytes- 00:06:38.835 [2024-04-19 10:25:00.776475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.835 [2024-04-19 10:25:00.776500] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.835 [2024-04-19 10:25:00.776550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.835 [2024-04-19 10:25:00.776564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.835 #39 NEW cov: 11907 ft: 14587 corp: 27/3581b lim: 320 exec/s: 39 rss: 70Mb L: 138/207 MS: 1 CopyPart- 00:06:38.835 [2024-04-19 10:25:00.816609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:75000000 cdw10:00000000 cdw11:00000000 00:06:38.835 [2024-04-19 10:25:00.816633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.835 [2024-04-19 10:25:00.816702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: VIRTUALIZATION MANAGEMENT (1c) qid:0 cid:5 nsid:1c1c1c1c cdw10:00000000 cdw11:00000000 00:06:38.835 [2024-04-19 10:25:00.816715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.835 [2024-04-19 10:25:00.816765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.835 [2024-04-19 10:25:00.816778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:38.835 #40 NEW cov: 11907 ft: 14604 corp: 28/3792b lim: 320 exec/s: 40 rss: 70Mb L: 211/211 MS: 1 CrossOver- 00:06:38.835 [2024-04-19 10:25:00.856541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.835 [2024-04-19 10:25:00.856565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.835 #41 NEW cov: 11907 ft: 14612 corp: 29/3908b lim: 320 exec/s: 41 rss: 70Mb L: 116/211 MS: 1 ShuffleBytes- 00:06:38.835 [2024-04-19 10:25:00.896962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.835 [2024-04-19 10:25:00.896986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.836 [2024-04-19 10:25:00.897035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.836 [2024-04-19 10:25:00.897047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.836 [2024-04-19 10:25:00.897112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.836 [2024-04-19 10:25:00.897125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:38.836 [2024-04-19 10:25:00.897175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:06:38.836 [2024-04-19 10:25:00.897188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:38.836 #42 NEW cov: 11907 ft: 14838 corp: 30/4170b lim: 
320 exec/s: 42 rss: 70Mb L: 262/262 MS: 1 CopyPart- 00:06:39.095 [2024-04-19 10:25:00.946886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:5b000000 cdw11:00000000 00:06:39.095 [2024-04-19 10:25:00.946910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.095 [2024-04-19 10:25:00.946962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:39.095 [2024-04-19 10:25:00.946975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.095 #43 NEW cov: 11907 ft: 14872 corp: 31/4319b lim: 320 exec/s: 43 rss: 70Mb L: 149/262 MS: 1 InsertByte- 00:06:39.095 [2024-04-19 10:25:00.987111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:75000000 cdw10:00000000 cdw11:00000000 00:06:39.095 [2024-04-19 10:25:00.987135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.095 [2024-04-19 10:25:00.987189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: VIRTUALIZATION MANAGEMENT (1c) qid:0 cid:5 nsid:1c1c1c1c cdw10:00000000 cdw11:00000000 00:06:39.095 [2024-04-19 10:25:00.987202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.095 [2024-04-19 10:25:00.987250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:39.095 [2024-04-19 10:25:00.987263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:39.095 #44 NEW cov: 11907 ft: 14876 corp: 32/4530b lim: 320 exec/s: 44 rss: 70Mb L: 211/262 MS: 1 ShuffleBytes- 00:06:39.095 [2024-04-19 10:25:01.027087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00009400 cdw11:00000000 00:06:39.095 [2024-04-19 10:25:01.027110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.095 [2024-04-19 10:25:01.027161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:39.096 [2024-04-19 10:25:01.027173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.096 #45 NEW cov: 11907 ft: 14883 corp: 33/4678b lim: 320 exec/s: 45 rss: 70Mb L: 148/262 MS: 1 ChangeBinInt- 00:06:39.096 [2024-04-19 10:25:01.067188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:39.096 [2024-04-19 10:25:01.067211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.096 [2024-04-19 10:25:01.067266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:39.096 [2024-04-19 10:25:01.067279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.096 #46 NEW cov: 11907 ft: 14896 corp: 34/4827b lim: 320 exec/s: 46 rss: 70Mb L: 149/262 MS: 1 InsertByte- 
00:06:39.096 [2024-04-19 10:25:01.107321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:39.096 [2024-04-19 10:25:01.107345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.096 [2024-04-19 10:25:01.107395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:39.096 [2024-04-19 10:25:01.107408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.096 #47 NEW cov: 11907 ft: 14909 corp: 35/4988b lim: 320 exec/s: 47 rss: 70Mb L: 161/262 MS: 1 ShuffleBytes- 00:06:39.096 [2024-04-19 10:25:01.147326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00730000 00:06:39.096 [2024-04-19 10:25:01.147349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.096 #48 NEW cov: 11907 ft: 14919 corp: 36/5105b lim: 320 exec/s: 48 rss: 71Mb L: 117/262 MS: 1 InsertByte- 00:06:39.096 [2024-04-19 10:25:01.187526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:39.096 [2024-04-19 10:25:01.187552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.096 [2024-04-19 10:25:01.187603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:3f cdw10:00000000 cdw11:00000000 00:06:39.096 [2024-04-19 10:25:01.187616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.357 #49 NEW cov: 11907 ft: 14931 corp: 37/5244b lim: 320 exec/s: 24 rss: 71Mb L: 139/262 MS: 1 InsertByte- 00:06:39.357 #49 DONE cov: 11907 ft: 14931 corp: 37/5244b lim: 320 exec/s: 24 rss: 71Mb 00:06:39.357 Done 49 runs in 2 second(s) 00:06:39.357 10:25:01 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_0.conf /var/tmp/suppress_nvmf_fuzz 00:06:39.357 10:25:01 -- ../common.sh@72 -- # (( i++ )) 00:06:39.357 10:25:01 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:39.357 10:25:01 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:06:39.357 10:25:01 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:06:39.357 10:25:01 -- nvmf/run.sh@24 -- # local timen=1 00:06:39.357 10:25:01 -- nvmf/run.sh@25 -- # local core=0x1 00:06:39.357 10:25:01 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:06:39.357 10:25:01 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:06:39.357 10:25:01 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:06:39.357 10:25:01 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:06:39.357 10:25:01 -- nvmf/run.sh@34 -- # printf %02d 1 00:06:39.357 10:25:01 -- nvmf/run.sh@34 -- # port=4401 00:06:39.357 10:25:01 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:06:39.357 10:25:01 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:06:39.357 10:25:01 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:39.357 10:25:01 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:06:39.357 10:25:01 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:06:39.357 10:25:01 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 00:06:39.357 [2024-04-19 10:25:01.374325] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:06:39.357 [2024-04-19 10:25:01.374409] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid200596 ] 00:06:39.357 EAL: No free 2048 kB hugepages reported on node 1 00:06:39.618 [2024-04-19 10:25:01.557143] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.618 [2024-04-19 10:25:01.625073] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.618 [2024-04-19 10:25:01.684088] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:39.618 [2024-04-19 10:25:01.700225] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:06:39.618 INFO: Running with entropic power schedule (0xFF, 100). 00:06:39.618 INFO: Seed: 183590228 00:06:39.878 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:06:39.878 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:06:39.878 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:06:39.878 INFO: A corpus is not provided, starting from an empty corpus 00:06:39.878 #2 INITED exec/s: 0 rss: 62Mb 00:06:39.878 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
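The nvmf/run.sh xtrace above amounts to a short setup-and-launch sequence per fuzzer instance. Below is a condensed sketch of those steps, not the script itself: the paths, flags, port scheme, and LSAN suppression entries are copied from the trace; the $rootdir and $output variable names, and the output redirections (which xtrace does not display), are assumptions.

  #!/usr/bin/env bash
  # Sketch of one iteration of the traced loop (fuzzer_type=1 -> NVMe/TCP port 4401).
  # Each iteration first removes the previous instance's conf and suppression
  # files (the "rm -rf /tmp/fuzz_json_0.conf ..." at run.sh@54 in the trace).
  rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # assumed variable name
  output=$rootdir/../output                                     # assumed variable name
  fuzzer_type=1
  timen=1      # -t: fuzz time in seconds (run.sh@24)
  core=0x1     # -m: DPDK core mask (run.sh@25)

  port="44$(printf '%02d' "$fuzzer_type")"                      # run.sh@34: 1 -> 4401
  corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
  nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
  suppress_file=/var/tmp/suppress_nvmf_fuzz

  mkdir -p "$corpus_dir"

  # Retarget the JSON template from the default listener port 4420 to this
  # instance's port; redirection into $nvmf_cfg is inferred, not shown by xtrace.
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

  # Known-benign leaks suppressed for LeakSanitizer (destination inferred).
  echo "leak:spdk_nvmf_qpair_disconnect" > "$suppress_file"
  echo "leak:nvmf_ctrlr_create" >> "$suppress_file"

  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

  LSAN_OPTIONS="report_objects=1:suppressions=$suppress_file:print_suppressions=0" \
      "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
      -m "$core" -s 512 -P "$output/llvm/" -F "$trid" \
      -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"

The launch line mirrors run.sh@45 exactly; the libFuzzer startup banner and coverage output that follow in the log are produced by this invocation.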
00:06:39.878 This may also happen if the target rejected all inputs we tried so far 00:06:39.878 [2024-04-19 10:25:01.771059] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xc5 00:06:39.878 [2024-04-19 10:25:01.771502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.878 [2024-04-19 10:25:01.771540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.878 [2024-04-19 10:25:01.771634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.878 [2024-04-19 10:25:01.771651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.878 [2024-04-19 10:25:01.771745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.878 [2024-04-19 10:25:01.771762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:39.878 [2024-04-19 10:25:01.771834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.878 [2024-04-19 10:25:01.771851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.139 NEW_FUNC[1/671]: 0x482600 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:06:40.139 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:40.139 #12 NEW cov: 11731 ft: 11732 corp: 2/25b lim: 30 exec/s: 0 rss: 69Mb L: 24/24 MS: 5 ShuffleBytes-ChangeByte-ChangeBit-ChangeBit-InsertRepeatedBytes- 00:06:40.139 [2024-04-19 10:25:02.090502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.139 [2024-04-19 10:25:02.090538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.139 [2024-04-19 10:25:02.090593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000cb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.139 [2024-04-19 10:25:02.090609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.139 [2024-04-19 10:25:02.090662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.139 [2024-04-19 10:25:02.090675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.139 [2024-04-19 10:25:02.090727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.139 [2024-04-19 10:25:02.090740] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.139 #13 NEW cov: 11861 ft: 12414 corp: 3/50b lim: 30 exec/s: 0 rss: 69Mb L: 25/25 MS: 1 InsertByte- 00:06:40.139 [2024-04-19 10:25:02.140170] ctrlr.c:2605:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (512) > len (4) 00:06:40.139 [2024-04-19 10:25:02.140593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.139 [2024-04-19 10:25:02.140618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.139 [2024-04-19 10:25:02.140673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000cb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.140 [2024-04-19 10:25:02.140687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.140 [2024-04-19 10:25:02.140740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.140 [2024-04-19 10:25:02.140754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.140 [2024-04-19 10:25:02.140806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.140 [2024-04-19 10:25:02.140824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.140 #14 NEW cov: 11873 ft: 12649 corp: 4/79b lim: 30 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 CMP- DE: "\002\000\000\000"- 00:06:40.140 [2024-04-19 10:25:02.180483] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xc5 00:06:40.140 [2024-04-19 10:25:02.180701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.140 [2024-04-19 10:25:02.180726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.140 [2024-04-19 10:25:02.180778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.140 [2024-04-19 10:25:02.180792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.140 [2024-04-19 10:25:02.180846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.140 [2024-04-19 10:25:02.180860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.140 [2024-04-19 10:25:02.180912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00fc0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.140 [2024-04-19 10:25:02.180925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 
p:0 m:0 dnr:0 00:06:40.140 #20 NEW cov: 11958 ft: 13033 corp: 5/103b lim: 30 exec/s: 0 rss: 69Mb L: 24/29 MS: 1 ChangeBinInt- 00:06:40.140 [2024-04-19 10:25:02.220411] ctrlr.c:2605:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (1024) > len (4) 00:06:40.140 [2024-04-19 10:25:02.220820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.140 [2024-04-19 10:25:02.220847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.140 [2024-04-19 10:25:02.220900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000cb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.140 [2024-04-19 10:25:02.220915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.140 [2024-04-19 10:25:02.220969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.140 [2024-04-19 10:25:02.220982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.140 [2024-04-19 10:25:02.221035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.140 [2024-04-19 10:25:02.221049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.140 #21 NEW cov: 11958 ft: 13095 corp: 6/128b lim: 30 exec/s: 0 rss: 70Mb L: 25/29 MS: 1 ChangeBit- 00:06:40.400 [2024-04-19 10:25:02.260695] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xc5 00:06:40.400 [2024-04-19 10:25:02.260919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0000001f cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.400 [2024-04-19 10:25:02.260944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.400 [2024-04-19 10:25:02.260997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.400 [2024-04-19 10:25:02.261011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.400 [2024-04-19 10:25:02.261064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.400 [2024-04-19 10:25:02.261078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.401 [2024-04-19 10:25:02.261132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.401 [2024-04-19 10:25:02.261145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.401 #22 NEW cov: 11958 ft: 13238 corp: 7/152b lim: 30 exec/s: 0 rss: 70Mb L: 24/29 MS: 1 
ChangeByte- 00:06:40.401 [2024-04-19 10:25:02.300432] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x6 00:06:40.401 [2024-04-19 10:25:02.300645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a020000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.401 [2024-04-19 10:25:02.300670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.401 #26 NEW cov: 11958 ft: 13967 corp: 8/158b lim: 30 exec/s: 0 rss: 70Mb L: 6/29 MS: 4 ShuffleBytes-CopyPart-ChangeBinInt-PersAutoDict- DE: "\002\000\000\000"- 00:06:40.401 [2024-04-19 10:25:02.340730] ctrlr.c:2605:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (512) > len (4) 00:06:40.401 [2024-04-19 10:25:02.341170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.401 [2024-04-19 10:25:02.341197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.401 [2024-04-19 10:25:02.341252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000cb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.401 [2024-04-19 10:25:02.341266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.401 [2024-04-19 10:25:02.341320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.401 [2024-04-19 10:25:02.341334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.401 [2024-04-19 10:25:02.341388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.401 [2024-04-19 10:25:02.341401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.401 #27 NEW cov: 11958 ft: 13983 corp: 9/185b lim: 30 exec/s: 0 rss: 70Mb L: 27/29 MS: 1 EraseBytes- 00:06:40.401 [2024-04-19 10:25:02.380861] ctrlr.c:2605:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (1024) > len (4) 00:06:40.401 [2024-04-19 10:25:02.381272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.401 [2024-04-19 10:25:02.381296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.401 [2024-04-19 10:25:02.381350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000db cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.401 [2024-04-19 10:25:02.381364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.401 [2024-04-19 10:25:02.381416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.401 [2024-04-19 10:25:02.381429] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.401 [2024-04-19 10:25:02.381482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.401 [2024-04-19 10:25:02.381495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.401 #28 NEW cov: 11958 ft: 14026 corp: 10/210b lim: 30 exec/s: 0 rss: 70Mb L: 25/29 MS: 1 ChangeBit- 00:06:40.401 [2024-04-19 10:25:02.420866] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (61684) > buf size (4096) 00:06:40.401 [2024-04-19 10:25:02.420979] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (61684) > buf size (4096) 00:06:40.401 [2024-04-19 10:25:02.421089] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (61684) > buf size (4096) 00:06:40.401 [2024-04-19 10:25:02.421193] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (61684) > buf size (4096) 00:06:40.401 [2024-04-19 10:25:02.421407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3c3c003c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.401 [2024-04-19 10:25:02.421431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.401 [2024-04-19 10:25:02.421486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:3c3c003c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.401 [2024-04-19 10:25:02.421499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.401 [2024-04-19 10:25:02.421556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3c3c003c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.401 [2024-04-19 10:25:02.421570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.401 [2024-04-19 10:25:02.421622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:3c3c003c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.401 [2024-04-19 10:25:02.421635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.401 #29 NEW cov: 11966 ft: 14068 corp: 11/237b lim: 30 exec/s: 0 rss: 70Mb L: 27/29 MS: 1 InsertRepeatedBytes- 00:06:40.401 [2024-04-19 10:25:02.460872] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x6 00:06:40.401 [2024-04-19 10:25:02.461089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a020000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.401 [2024-04-19 10:25:02.461113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.401 #35 NEW cov: 11966 ft: 14137 corp: 12/243b lim: 30 exec/s: 0 rss: 70Mb L: 6/29 MS: 1 ChangeBinInt- 00:06:40.401 [2024-04-19 10:25:02.511162] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (12340) > buf size (4096) 00:06:40.662 [2024-04-19 10:25:02.511672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 
cid:4 nsid:0 cdw10:0c0c000c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.662 [2024-04-19 10:25:02.511697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.662 [2024-04-19 10:25:02.511753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.662 [2024-04-19 10:25:02.511767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.662 [2024-04-19 10:25:02.511824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.662 [2024-04-19 10:25:02.511838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.662 [2024-04-19 10:25:02.511893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.662 [2024-04-19 10:25:02.511907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.662 #36 NEW cov: 11966 ft: 14157 corp: 13/270b lim: 30 exec/s: 0 rss: 70Mb L: 27/29 MS: 1 InsertRepeatedBytes- 00:06:40.662 [2024-04-19 10:25:02.551244] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10252) > buf size (4096) 00:06:40.662 [2024-04-19 10:25:02.551760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a020000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.662 [2024-04-19 10:25:02.551784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.662 [2024-04-19 10:25:02.551838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.662 [2024-04-19 10:25:02.551852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.662 [2024-04-19 10:25:02.551904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.662 [2024-04-19 10:25:02.551918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.662 [2024-04-19 10:25:02.551974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.662 [2024-04-19 10:25:02.551987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.662 #37 NEW cov: 11966 ft: 14173 corp: 14/299b lim: 30 exec/s: 0 rss: 70Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:06:40.662 [2024-04-19 10:25:02.601253] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10252) > buf size (4096) 00:06:40.662 [2024-04-19 10:25:02.601484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a020006 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:06:40.662 [2024-04-19 10:25:02.601508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.662 #38 NEW cov: 11966 ft: 14198 corp: 15/305b lim: 30 exec/s: 0 rss: 70Mb L: 6/29 MS: 1 ChangeBinInt- 00:06:40.662 [2024-04-19 10:25:02.641560] ctrlr.c:2605:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (512) > len (4) 00:06:40.662 [2024-04-19 10:25:02.641676] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (65540) > buf size (4096) 00:06:40.662 [2024-04-19 10:25:02.642002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.662 [2024-04-19 10:25:02.642027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.662 [2024-04-19 10:25:02.642082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000cb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.662 [2024-04-19 10:25:02.642095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.662 [2024-04-19 10:25:02.642148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:40000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.662 [2024-04-19 10:25:02.642162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.662 [2024-04-19 10:25:02.642216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.662 [2024-04-19 10:25:02.642230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.662 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:40.662 #39 NEW cov: 11989 ft: 14249 corp: 16/334b lim: 30 exec/s: 0 rss: 70Mb L: 29/29 MS: 1 ChangeBit- 00:06:40.662 [2024-04-19 10:25:02.681481] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (61684) > buf size (4096) 00:06:40.662 [2024-04-19 10:25:02.681695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3c3c003c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.662 [2024-04-19 10:25:02.681720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.662 #40 NEW cov: 11989 ft: 14266 corp: 17/342b lim: 30 exec/s: 0 rss: 70Mb L: 8/29 MS: 1 CrossOver- 00:06:40.662 [2024-04-19 10:25:02.731960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.662 [2024-04-19 10:25:02.731984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.662 [2024-04-19 10:25:02.732038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.662 [2024-04-19 10:25:02.732052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.662 #41 NEW cov: 11989 ft: 14544 corp: 18/355b lim: 30 exec/s: 41 rss: 70Mb L: 13/29 MS: 1 EraseBytes- 00:06:40.662 [2024-04-19 10:25:02.772092] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (224260) > buf size (4096) 00:06:40.662 [2024-04-19 10:25:02.772419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.662 [2024-04-19 10:25:02.772444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.662 [2024-04-19 10:25:02.772500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00020000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.662 [2024-04-19 10:25:02.772514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.662 [2024-04-19 10:25:02.772567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:db000004 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.662 [2024-04-19 10:25:02.772581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.662 [2024-04-19 10:25:02.772635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.662 [2024-04-19 10:25:02.772648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.923 #42 NEW cov: 11989 ft: 14563 corp: 19/384b lim: 30 exec/s: 42 rss: 70Mb L: 29/29 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:06:40.923 [2024-04-19 10:25:02.811845] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3b 00:06:40.923 [2024-04-19 10:25:02.812057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a020000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.923 [2024-04-19 10:25:02.812081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.923 #43 NEW cov: 11989 ft: 14640 corp: 20/390b lim: 30 exec/s: 43 rss: 70Mb L: 6/29 MS: 1 ChangeByte- 00:06:40.923 [2024-04-19 10:25:02.852293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000010 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.923 [2024-04-19 10:25:02.852319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.923 [2024-04-19 10:25:02.852375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.923 [2024-04-19 10:25:02.852388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.923 #44 NEW cov: 11989 ft: 14743 corp: 21/403b lim: 30 exec/s: 44 rss: 70Mb L: 13/29 MS: 1 ChangeBit- 00:06:40.923 [2024-04-19 10:25:02.902162] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (61684) > buf size (4096) 00:06:40.923 [2024-04-19 
10:25:02.902283] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (61684) > buf size (4096) 00:06:40.923 [2024-04-19 10:25:02.902396] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (61684) > buf size (4096) 00:06:40.923 [2024-04-19 10:25:02.902611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3c3c003c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.923 [2024-04-19 10:25:02.902636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.923 [2024-04-19 10:25:02.902689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:3c3c003c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.923 [2024-04-19 10:25:02.902703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.923 [2024-04-19 10:25:02.902760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3c3c003c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.923 [2024-04-19 10:25:02.902774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.923 #45 NEW cov: 11989 ft: 14991 corp: 22/422b lim: 30 exec/s: 45 rss: 70Mb L: 19/29 MS: 1 EraseBytes- 00:06:40.923 [2024-04-19 10:25:02.942549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.923 [2024-04-19 10:25:02.942573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.923 [2024-04-19 10:25:02.942628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.923 [2024-04-19 10:25:02.942642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.923 #46 NEW cov: 11989 ft: 15075 corp: 23/435b lim: 30 exec/s: 46 rss: 70Mb L: 13/29 MS: 1 ChangeBinInt- 00:06:40.923 [2024-04-19 10:25:02.982534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.923 [2024-04-19 10:25:02.982559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.923 #47 NEW cov: 11989 ft: 15175 corp: 24/442b lim: 30 exec/s: 47 rss: 71Mb L: 7/29 MS: 1 CrossOver- 00:06:40.923 [2024-04-19 10:25:03.032758] ctrlr.c:2605:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (512) > len (4) 00:06:40.923 [2024-04-19 10:25:03.033085] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xc5 00:06:40.923 [2024-04-19 10:25:03.033314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.923 [2024-04-19 10:25:03.033339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.924 [2024-04-19 10:25:03.033396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 
nsid:0 cdw10:000000cb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.924 [2024-04-19 10:25:03.033411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.924 [2024-04-19 10:25:03.033466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.924 [2024-04-19 10:25:03.033480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.924 [2024-04-19 10:25:03.033534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.924 [2024-04-19 10:25:03.033547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.924 [2024-04-19 10:25:03.033602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.924 [2024-04-19 10:25:03.033616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:41.184 #48 NEW cov: 11989 ft: 15225 corp: 25/472b lim: 30 exec/s: 48 rss: 71Mb L: 30/30 MS: 1 CopyPart- 00:06:41.184 [2024-04-19 10:25:03.072607] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10252) > buf size (4096) 00:06:41.184 [2024-04-19 10:25:03.072844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a020002 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.184 [2024-04-19 10:25:03.072872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.184 #49 NEW cov: 11989 ft: 15252 corp: 26/478b lim: 30 exec/s: 49 rss: 71Mb L: 6/30 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:06:41.184 [2024-04-19 10:25:03.112746] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.184 [2024-04-19 10:25:03.112872] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000202 00:06:41.184 [2024-04-19 10:25:03.113105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.184 [2024-04-19 10:25:03.113130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.184 [2024-04-19 10:25:03.113185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.184 [2024-04-19 10:25:03.113200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.184 #50 NEW cov: 11989 ft: 15275 corp: 27/493b lim: 30 exec/s: 50 rss: 71Mb L: 15/30 MS: 1 InsertRepeatedBytes- 00:06:41.184 [2024-04-19 10:25:03.162878] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (61684) > buf size (4096) 00:06:41.184 [2024-04-19 10:25:03.163095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3c3c003c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:06:41.184 [2024-04-19 10:25:03.163120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.184 #51 NEW cov: 11989 ft: 15284 corp: 28/501b lim: 30 exec/s: 51 rss: 71Mb L: 8/30 MS: 1 ChangeByte- 00:06:41.184 [2024-04-19 10:25:03.213063] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (61684) > buf size (4096) 00:06:41.184 [2024-04-19 10:25:03.213282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3c3c003c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.184 [2024-04-19 10:25:03.213306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.184 #52 NEW cov: 11989 ft: 15295 corp: 29/509b lim: 30 exec/s: 52 rss: 71Mb L: 8/30 MS: 1 CopyPart- 00:06:41.184 [2024-04-19 10:25:03.253468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:000000b9 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.184 [2024-04-19 10:25:03.253493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.184 [2024-04-19 10:25:03.253548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.184 [2024-04-19 10:25:03.253561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.184 #58 NEW cov: 11989 ft: 15300 corp: 30/522b lim: 30 exec/s: 58 rss: 71Mb L: 13/30 MS: 1 ChangeByte- 00:06:41.184 [2024-04-19 10:25:03.293603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:000000b9 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.184 [2024-04-19 10:25:03.293628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.184 [2024-04-19 10:25:03.293685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.184 [2024-04-19 10:25:03.293698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.445 #59 NEW cov: 11989 ft: 15338 corp: 31/537b lim: 30 exec/s: 59 rss: 71Mb L: 15/30 MS: 1 CopyPart- 00:06:41.445 [2024-04-19 10:25:03.343473] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff 00:06:41.445 [2024-04-19 10:25:03.343598] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:06:41.445 [2024-04-19 10:25:03.343711] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xc5 00:06:41.445 [2024-04-19 10:25:03.343948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.445 [2024-04-19 10:25:03.343980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.445 [2024-04-19 10:25:03.344039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:06:41.445 [2024-04-19 10:25:03.344053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.445 [2024-04-19 10:25:03.344107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.445 [2024-04-19 10:25:03.344120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.445 #60 NEW cov: 11989 ft: 15340 corp: 32/555b lim: 30 exec/s: 60 rss: 71Mb L: 18/30 MS: 1 InsertRepeatedBytes- 00:06:41.445 [2024-04-19 10:25:03.383918] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xc5 00:06:41.445 [2024-04-19 10:25:03.384166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0000001f cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.445 [2024-04-19 10:25:03.384189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.445 [2024-04-19 10:25:03.384246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.445 [2024-04-19 10:25:03.384260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.445 [2024-04-19 10:25:03.384317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.445 [2024-04-19 10:25:03.384331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.445 [2024-04-19 10:25:03.384386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.445 [2024-04-19 10:25:03.384399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.445 #61 NEW cov: 11989 ft: 15355 corp: 33/579b lim: 30 exec/s: 61 rss: 71Mb L: 24/30 MS: 1 ShuffleBytes- 00:06:41.445 [2024-04-19 10:25:03.424033] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xc5 00:06:41.445 [2024-04-19 10:25:03.424253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0000001f cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.445 [2024-04-19 10:25:03.424277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.445 [2024-04-19 10:25:03.424331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.445 [2024-04-19 10:25:03.424344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.445 [2024-04-19 10:25:03.424399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.445 [2024-04-19 10:25:03.424412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.445 [2024-04-19 10:25:03.424469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.445 [2024-04-19 10:25:03.424482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.445 #62 NEW cov: 11989 ft: 15386 corp: 34/603b lim: 30 exec/s: 62 rss: 71Mb L: 24/30 MS: 1 ShuffleBytes- 00:06:41.445 [2024-04-19 10:25:03.463889] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (786436) > buf size (4096) 00:06:41.445 [2024-04-19 10:25:03.464012] ctrlr.c:2605:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (512) > len (4) 00:06:41.445 [2024-04-19 10:25:03.464328] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xc5 00:06:41.445 [2024-04-19 10:25:03.464540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.445 [2024-04-19 10:25:03.464565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.445 [2024-04-19 10:25:03.464622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000cb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.445 [2024-04-19 10:25:03.464636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.445 [2024-04-19 10:25:03.464692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.445 [2024-04-19 10:25:03.464706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.445 [2024-04-19 10:25:03.464759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.445 [2024-04-19 10:25:03.464772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.445 [2024-04-19 10:25:03.464827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.445 [2024-04-19 10:25:03.464841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:41.445 #63 NEW cov: 11989 ft: 15432 corp: 35/633b lim: 30 exec/s: 63 rss: 72Mb L: 30/30 MS: 1 ChangeByte- 00:06:41.445 [2024-04-19 10:25:03.503874] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (61684) > buf size (4096) 00:06:41.445 [2024-04-19 10:25:03.504086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3c3c003c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.445 [2024-04-19 10:25:03.504110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.445 #64 NEW cov: 11989 ft: 15440 corp: 36/639b lim: 30 exec/s: 64 rss: 72Mb L: 6/30 MS: 1 CrossOver- 00:06:41.445 [2024-04-19 10:25:03.554016] 
ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (585972) > buf size (4096) 00:06:41.445 [2024-04-19 10:25:03.554236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3c3c023c cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.445 [2024-04-19 10:25:03.554260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.706 #65 NEW cov: 11989 ft: 15459 corp: 37/649b lim: 30 exec/s: 65 rss: 72Mb L: 10/30 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:06:41.706 [2024-04-19 10:25:03.604259] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000206 00:06:41.706 [2024-04-19 10:25:03.604777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.706 [2024-04-19 10:25:03.604805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.706 [2024-04-19 10:25:03.604866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.706 [2024-04-19 10:25:03.604880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.706 [2024-04-19 10:25:03.604932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.706 [2024-04-19 10:25:03.604946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.706 [2024-04-19 10:25:03.605000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.706 [2024-04-19 10:25:03.605013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.706 #66 NEW cov: 11989 ft: 15469 corp: 38/678b lim: 30 exec/s: 66 rss: 72Mb L: 29/30 MS: 1 CrossOver- 00:06:41.706 [2024-04-19 10:25:03.644288] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff 00:06:41.706 [2024-04-19 10:25:03.644407] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (261124) > buf size (4096) 00:06:41.706 [2024-04-19 10:25:03.644655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.706 [2024-04-19 10:25:03.644680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.706 [2024-04-19 10:25:03.644738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.706 [2024-04-19 10:25:03.644752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.706 #67 NEW cov: 11989 ft: 15529 corp: 39/693b lim: 30 exec/s: 67 rss: 72Mb L: 15/30 MS: 1 EraseBytes- 00:06:41.706 [2024-04-19 10:25:03.694632] ctrlr.c:2605:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (512) > len (4) 
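The ctrlr.c *ERROR* lines repeated through this run come from the target's Get Log Page validation: the fuzzer keeps generating NUMD/offset fields that fail the bounds checks before any data is copied, so each attempt completes with INVALID FIELD. The numbers are consistent with the printed commands: with cdw10:3c3c003c, NUMDL = 0x3c3c, so the requested transfer length is (0x3c3c + 1) * 4 = 61684 bytes, exactly the "len (61684) > buf size (4096)" rejection above. The sketch below shows that style of check in minimal C; the struct, constant, and function names are hypothetical illustrations, not SPDK's actual nvmf_ctrlr_get_log_page().

    #include <stdbool.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Illustrative only: names and constants are hypothetical, not SPDK's
     * real implementation. BUF_SIZE stands in for the 4096-byte buffer
     * the errors above compare against. */
    #define BUF_SIZE 4096

    struct get_log_page_cmd {
        uint32_t cdw10; /* bits 31:16 = NUMDL (0-based dword count, low)  */
        uint32_t cdw11; /* bits 15:0  = NUMDU (0-based dword count, high) */
        uint32_t cdw12; /* LPOL: log page offset, lower 32 bits           */
        uint32_t cdw13; /* LPOU: log page offset, upper 32 bits           */
    };

    static bool
    validate_get_log_page(const struct get_log_page_cmd *cmd, uint64_t page_len)
    {
        /* NUMD is a 0-based dword count split across cdw10/cdw11. */
        uint64_t numd = ((uint64_t)(cmd->cdw11 & 0xffff) << 16) |
                        (cmd->cdw10 >> 16);
        uint64_t len = (numd + 1) * 4;
        uint64_t offset = ((uint64_t)cmd->cdw13 << 32) | cmd->cdw12;

        if (len > BUF_SIZE) {
            fprintf(stderr, "Get log page: len (%llu) > buf size (%u)\n",
                    (unsigned long long)len, BUF_SIZE);
            return false; /* completed as INVALID FIELD (00/02) */
        }
        if (offset > page_len) {
            fprintf(stderr, "Get log page: offset (%llu) > len (%llu)\n",
                    (unsigned long long)offset, (unsigned long long)page_len);
            return false;
        }
        return true;
    }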
00:06:41.706 [2024-04-19 10:25:03.695071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.706 [2024-04-19 10:25:03.695095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.706 [2024-04-19 10:25:03.695154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000cb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.706 [2024-04-19 10:25:03.695168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.706 [2024-04-19 10:25:03.695221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00040000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.706 [2024-04-19 10:25:03.695234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.706 [2024-04-19 10:25:03.695288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.706 [2024-04-19 10:25:03.695301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.706 #68 NEW cov: 11989 ft: 15556 corp: 40/722b lim: 30 exec/s: 68 rss: 72Mb L: 29/30 MS: 1 ChangeBit- 00:06:41.706 [2024-04-19 10:25:03.734833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.706 [2024-04-19 10:25:03.734860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.706 [2024-04-19 10:25:03.734916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:007e0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.706 [2024-04-19 10:25:03.734930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.706 #69 NEW cov: 11989 ft: 15562 corp: 41/736b lim: 30 exec/s: 34 rss: 72Mb L: 14/30 MS: 1 InsertByte- 00:06:41.706 #69 DONE cov: 11989 ft: 15562 corp: 41/736b lim: 30 exec/s: 34 rss: 72Mb 00:06:41.706 ###### Recommended dictionary. ###### 00:06:41.706 "\002\000\000\000" # Uses: 4 00:06:41.706 ###### End of recommended dictionary. 
###### 00:06:41.706 Done 69 runs in 2 second(s) 00:06:41.966 10:25:03 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_1.conf /var/tmp/suppress_nvmf_fuzz 00:06:41.966 10:25:03 -- ../common.sh@72 -- # (( i++ )) 00:06:41.966 10:25:03 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:41.966 10:25:03 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:06:41.966 10:25:03 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:06:41.966 10:25:03 -- nvmf/run.sh@24 -- # local timen=1 00:06:41.966 10:25:03 -- nvmf/run.sh@25 -- # local core=0x1 00:06:41.966 10:25:03 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:06:41.966 10:25:03 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:06:41.966 10:25:03 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:06:41.966 10:25:03 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:06:41.966 10:25:03 -- nvmf/run.sh@34 -- # printf %02d 2 00:06:41.966 10:25:03 -- nvmf/run.sh@34 -- # port=4402 00:06:41.966 10:25:03 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:06:41.966 10:25:03 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:06:41.966 10:25:03 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:41.966 10:25:03 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:06:41.966 10:25:03 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:06:41.966 10:25:03 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 00:06:41.966 [2024-04-19 10:25:03.912918] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:06:41.966 [2024-04-19 10:25:03.912993] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid200941 ] 00:06:41.966 EAL: No free 2048 kB hugepages reported on node 1 00:06:42.226 [2024-04-19 10:25:04.094802] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.226 [2024-04-19 10:25:04.162518] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.226 [2024-04-19 10:25:04.222118] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:42.226 [2024-04-19 10:25:04.238294] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:06:42.226 INFO: Running with entropic power schedule (0xFF, 100). 
00:06:42.226 INFO: Seed: 2722599199 00:06:42.226 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:06:42.226 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:06:42.226 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:06:42.226 INFO: A corpus is not provided, starting from an empty corpus 00:06:42.226 #2 INITED exec/s: 0 rss: 62Mb 00:06:42.226 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:06:42.226 This may also happen if the target rejected all inputs we tried so far 00:06:42.226 [2024-04-19 10:25:04.305044] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:42.226 [2024-04-19 10:25:04.305478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.226 [2024-04-19 10:25:04.305514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.226 [2024-04-19 10:25:04.305603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.226 [2024-04-19 10:25:04.305622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.798 NEW_FUNC[1/670]: 0x4850b0 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:06:42.798 NEW_FUNC[2/670]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:42.798 #5 NEW cov: 11664 ft: 11665 corp: 2/16b lim: 35 exec/s: 0 rss: 69Mb L: 15/15 MS: 3 CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:06:42.798 [2024-04-19 10:25:04.636030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a0a000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.798 [2024-04-19 10:25:04.636072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.798 #6 NEW cov: 11794 ft: 12539 corp: 3/28b lim: 35 exec/s: 0 rss: 69Mb L: 12/15 MS: 1 CrossOver- 00:06:42.798 [2024-04-19 10:25:04.696004] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:42.798 [2024-04-19 10:25:04.696488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.798 [2024-04-19 10:25:04.696516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.798 [2024-04-19 10:25:04.696612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:2b000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.798 [2024-04-19 10:25:04.696630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.798 #7 NEW cov: 11800 ft: 12831 corp: 4/43b lim: 35 exec/s: 0 rss: 69Mb L: 15/15 MS: 1 ChangeByte- 00:06:42.798 [2024-04-19 10:25:04.747146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 
cid:4 nsid:0 cdw10:1a1a000a cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.798 [2024-04-19 10:25:04.747171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.798 [2024-04-19 10:25:04.747259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1a1a001a cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.798 [2024-04-19 10:25:04.747277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.798 [2024-04-19 10:25:04.747365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:1a1a001a cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.798 [2024-04-19 10:25:04.747380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:42.798 [2024-04-19 10:25:04.747470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:1a0a001a cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.798 [2024-04-19 10:25:04.747484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:42.798 #8 NEW cov: 11885 ft: 13614 corp: 5/77b lim: 35 exec/s: 0 rss: 70Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:06:42.798 [2024-04-19 10:25:04.806383] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:42.798 [2024-04-19 10:25:04.806817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.798 [2024-04-19 10:25:04.806844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.798 [2024-04-19 10:25:04.806925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00040000 cdw11:2b000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.798 [2024-04-19 10:25:04.806944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.798 #9 NEW cov: 11885 ft: 13686 corp: 6/92b lim: 35 exec/s: 0 rss: 70Mb L: 15/34 MS: 1 ChangeBit- 00:06:42.798 [2024-04-19 10:25:04.856682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.798 [2024-04-19 10:25:04.856709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.798 #10 NEW cov: 11885 ft: 13721 corp: 7/101b lim: 35 exec/s: 0 rss: 70Mb L: 9/34 MS: 1 CrossOver- 00:06:42.798 [2024-04-19 10:25:04.907732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:e5e500f1 cdw11:e500e5e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.798 [2024-04-19 10:25:04.907758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.798 [2024-04-19 10:25:04.907846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1a1a00e5 cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.798 [2024-04-19 10:25:04.907864] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.798 [2024-04-19 10:25:04.907953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:1a1a001a cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.798 [2024-04-19 10:25:04.907969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:42.798 [2024-04-19 10:25:04.908063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:1a0a001a cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.798 [2024-04-19 10:25:04.908078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:43.059 #11 NEW cov: 11885 ft: 13790 corp: 8/135b lim: 35 exec/s: 0 rss: 70Mb L: 34/34 MS: 1 ChangeBinInt- 00:06:43.059 [2024-04-19 10:25:04.967157] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.059 [2024-04-19 10:25:04.967588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.059 [2024-04-19 10:25:04.967612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.059 [2024-04-19 10:25:04.967707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:a4a400a4 cdw11:a400a4a4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.059 [2024-04-19 10:25:04.967723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:43.059 [2024-04-19 10:25:04.967814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.059 [2024-04-19 10:25:04.967832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:43.059 #12 NEW cov: 11885 ft: 13991 corp: 9/157b lim: 35 exec/s: 0 rss: 70Mb L: 22/34 MS: 1 InsertRepeatedBytes- 00:06:43.059 [2024-04-19 10:25:05.017247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.059 [2024-04-19 10:25:05.017277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.059 #13 NEW cov: 11885 ft: 14064 corp: 10/165b lim: 35 exec/s: 0 rss: 70Mb L: 8/34 MS: 1 EraseBytes- 00:06:43.059 [2024-04-19 10:25:05.067308] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.059 [2024-04-19 10:25:05.067744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.059 [2024-04-19 10:25:05.067771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.059 [2024-04-19 10:25:05.067869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.059 [2024-04-19 10:25:05.067890] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:43.059 #14 NEW cov: 11885 ft: 14099 corp: 11/180b lim: 35 exec/s: 0 rss: 70Mb L: 15/34 MS: 1 ShuffleBytes- 00:06:43.059 [2024-04-19 10:25:05.117580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d4d400d4 cdw11:d400d4d4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.059 [2024-04-19 10:25:05.117604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.059 #17 NEW cov: 11885 ft: 14118 corp: 12/192b lim: 35 exec/s: 0 rss: 70Mb L: 12/34 MS: 3 CopyPart-InsertByte-InsertRepeatedBytes- 00:06:43.059 [2024-04-19 10:25:05.168722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:e5e500f1 cdw11:e500e5e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.059 [2024-04-19 10:25:05.168748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.059 [2024-04-19 10:25:05.168863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1a1a00e5 cdw11:e6001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.059 [2024-04-19 10:25:05.168886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:43.059 [2024-04-19 10:25:05.168981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:1a1a001a cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.059 [2024-04-19 10:25:05.168998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:43.059 [2024-04-19 10:25:05.169084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:1a0a001a cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.059 [2024-04-19 10:25:05.169099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:43.320 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:43.320 #18 NEW cov: 11908 ft: 14183 corp: 13/226b lim: 35 exec/s: 0 rss: 70Mb L: 34/34 MS: 1 ChangeBinInt- 00:06:43.320 [2024-04-19 10:25:05.228084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff18000a cdw11:7d00f906 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.320 [2024-04-19 10:25:05.228109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.320 #19 NEW cov: 11908 ft: 14208 corp: 14/235b lim: 35 exec/s: 0 rss: 70Mb L: 9/34 MS: 1 CMP- DE: "\377\030\371\006}\313`\364"- 00:06:43.320 [2024-04-19 10:25:05.278134] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.320 [2024-04-19 10:25:05.278599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.320 [2024-04-19 10:25:05.278631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.320 [2024-04-19 10:25:05.278728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:2b000060 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.320 [2024-04-19 10:25:05.278748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:43.320 #20 NEW cov: 11908 ft: 14278 corp: 15/250b lim: 35 exec/s: 20 rss: 70Mb L: 15/34 MS: 1 ChangeByte- 00:06:43.320 [2024-04-19 10:25:05.328344] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.320 [2024-04-19 10:25:05.328813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00009000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.320 [2024-04-19 10:25:05.328839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.320 [2024-04-19 10:25:05.328933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:2b000060 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.320 [2024-04-19 10:25:05.328953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:43.320 #21 NEW cov: 11908 ft: 14294 corp: 16/265b lim: 35 exec/s: 21 rss: 70Mb L: 15/34 MS: 1 ChangeByte- 00:06:43.320 [2024-04-19 10:25:05.379191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.320 [2024-04-19 10:25:05.379216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.320 [2024-04-19 10:25:05.379311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:e6e600e6 cdw11:e600e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.320 [2024-04-19 10:25:05.379328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:43.320 [2024-04-19 10:25:05.379416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:e6e600e6 cdw11:e600e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.320 [2024-04-19 10:25:05.379431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:43.320 #22 NEW cov: 11908 ft: 14306 corp: 17/288b lim: 35 exec/s: 22 rss: 70Mb L: 23/34 MS: 1 InsertRepeatedBytes- 00:06:43.581 [2024-04-19 10:25:05.438818] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.581 [2024-04-19 10:25:05.439305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0800000a cdw11:00009000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.581 [2024-04-19 10:25:05.439333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.581 [2024-04-19 10:25:05.439427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:2b000060 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.581 [2024-04-19 10:25:05.439446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:43.581 #23 NEW cov: 11908 ft: 14323 corp: 18/303b lim: 35 exec/s: 23 rss: 
71Mb L: 15/34 MS: 1 ChangeBit- 00:06:43.581 [2024-04-19 10:25:05.500002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:006000f1 cdw11:e500e5e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.581 [2024-04-19 10:25:05.500028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.581 [2024-04-19 10:25:05.500114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1a1a00e5 cdw11:e6001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.581 [2024-04-19 10:25:05.500133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:43.581 [2024-04-19 10:25:05.500225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:1a1a001a cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.581 [2024-04-19 10:25:05.500240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:43.581 [2024-04-19 10:25:05.500335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:1a0a001a cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.581 [2024-04-19 10:25:05.500350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:43.581 #24 NEW cov: 11908 ft: 14346 corp: 19/337b lim: 35 exec/s: 24 rss: 71Mb L: 34/34 MS: 1 CrossOver- 00:06:43.581 [2024-04-19 10:25:05.559259] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.581 [2024-04-19 10:25:05.559697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000063 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.581 [2024-04-19 10:25:05.559724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.581 [2024-04-19 10:25:05.559806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:2b000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.581 [2024-04-19 10:25:05.559829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:43.581 #25 NEW cov: 11908 ft: 14353 corp: 20/352b lim: 35 exec/s: 25 rss: 71Mb L: 15/34 MS: 1 ChangeByte- 00:06:43.581 [2024-04-19 10:25:05.610252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:e5e500f1 cdw11:e500e5e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.581 [2024-04-19 10:25:05.610278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.581 [2024-04-19 10:25:05.610368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1a1a00e5 cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.581 [2024-04-19 10:25:05.610385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:43.581 [2024-04-19 10:25:05.610478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.581 
[2024-04-19 10:25:05.610495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:43.581 #26 NEW cov: 11908 ft: 14385 corp: 21/376b lim: 35 exec/s: 26 rss: 71Mb L: 24/34 MS: 1 EraseBytes- 00:06:43.581 [2024-04-19 10:25:05.659781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d4d400d4 cdw11:0000d400 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.581 [2024-04-19 10:25:05.659813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.581 #27 NEW cov: 11908 ft: 14401 corp: 22/388b lim: 35 exec/s: 27 rss: 71Mb L: 12/34 MS: 1 CrossOver- 00:06:43.842 [2024-04-19 10:25:05.720584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff18000a cdw11:7d00f906 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.842 [2024-04-19 10:25:05.720612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.842 [2024-04-19 10:25:05.720711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:7b7b0060 cdw11:7b007b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.842 [2024-04-19 10:25:05.720729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:43.842 [2024-04-19 10:25:05.720819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:7b7b007b cdw11:7b007b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.842 [2024-04-19 10:25:05.720837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:43.842 #33 NEW cov: 11908 ft: 14416 corp: 23/411b lim: 35 exec/s: 33 rss: 71Mb L: 23/34 MS: 1 InsertRepeatedBytes- 00:06:43.842 [2024-04-19 10:25:05.781212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:e5e500f1 cdw11:e500e5e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.842 [2024-04-19 10:25:05.781239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.842 [2024-04-19 10:25:05.781339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1ae500e5 cdw11:e6001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.842 [2024-04-19 10:25:05.781357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:43.842 [2024-04-19 10:25:05.781441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:1a1a001a cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.842 [2024-04-19 10:25:05.781457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:43.842 [2024-04-19 10:25:05.781549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:1a0a001a cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.842 [2024-04-19 10:25:05.781564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:43.842 #34 NEW cov: 11908 ft: 14505 corp: 24/445b lim: 35 exec/s: 34 rss: 71Mb L: 34/34 MS: 1 ShuffleBytes- 
00:06:43.842 [2024-04-19 10:25:05.830478] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.842 [2024-04-19 10:25:05.830733] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.842 [2024-04-19 10:25:05.830996] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.842 [2024-04-19 10:25:05.831441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.842 [2024-04-19 10:25:05.831467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.842 [2024-04-19 10:25:05.831557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.842 [2024-04-19 10:25:05.831574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:43.842 [2024-04-19 10:25:05.831665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.842 [2024-04-19 10:25:05.831683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:43.842 [2024-04-19 10:25:05.831771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.842 [2024-04-19 10:25:05.831786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:43.842 #35 NEW cov: 11908 ft: 14568 corp: 25/478b lim: 35 exec/s: 35 rss: 71Mb L: 33/34 MS: 1 InsertRepeatedBytes- 00:06:43.842 [2024-04-19 10:25:05.890608] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.842 [2024-04-19 10:25:05.891087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.842 [2024-04-19 10:25:05.891117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.842 [2024-04-19 10:25:05.891209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00002b00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.842 [2024-04-19 10:25:05.891227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:43.842 #36 NEW cov: 11908 ft: 14659 corp: 26/492b lim: 35 exec/s: 36 rss: 72Mb L: 14/34 MS: 1 CrossOver- 00:06:43.842 [2024-04-19 10:25:05.940769] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.842 [2024-04-19 10:25:05.941203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a0a000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.842 [2024-04-19 10:25:05.941242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.842 [2024-04-19 10:25:05.941332] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:000a0000 cdw11:6b00006b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.842 [2024-04-19 10:25:05.941349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.102 #37 NEW cov: 11908 ft: 14675 corp: 27/511b lim: 35 exec/s: 37 rss: 72Mb L: 19/34 MS: 1 InsertRepeatedBytes- 00:06:44.102 [2024-04-19 10:25:05.991852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:e5e500f1 cdw11:e500e5e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.102 [2024-04-19 10:25:05.991877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.102 [2024-04-19 10:25:05.991963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1ae500e5 cdw11:e6001c1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.102 [2024-04-19 10:25:05.991979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.102 [2024-04-19 10:25:05.992082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:1a1a001a cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.102 [2024-04-19 10:25:05.992098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:44.102 [2024-04-19 10:25:05.992190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:1a0a001a cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.102 [2024-04-19 10:25:05.992206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:44.102 #38 NEW cov: 11908 ft: 14710 corp: 28/545b lim: 35 exec/s: 38 rss: 72Mb L: 34/34 MS: 1 ChangeBinInt- 00:06:44.102 [2024-04-19 10:25:06.051304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a0a000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.102 [2024-04-19 10:25:06.051329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.102 #39 NEW cov: 11908 ft: 14735 corp: 29/557b lim: 35 exec/s: 39 rss: 72Mb L: 12/34 MS: 1 CrossOver- 00:06:44.102 [2024-04-19 10:25:06.101271] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.102 [2024-04-19 10:25:06.101711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.102 [2024-04-19 10:25:06.101737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.102 [2024-04-19 10:25:06.101827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.102 [2024-04-19 10:25:06.101862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.102 #40 NEW cov: 11908 ft: 14738 corp: 30/572b lim: 35 exec/s: 40 rss: 72Mb L: 15/34 MS: 1 CrossOver- 00:06:44.103 [2024-04-19 10:25:06.151555] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a0a000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.103 [2024-04-19 10:25:06.151580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.103 #41 NEW cov: 11908 ft: 14750 corp: 31/584b lim: 35 exec/s: 41 rss: 72Mb L: 12/34 MS: 1 ChangeByte- 00:06:44.103 [2024-04-19 10:25:06.202007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:e600f4d3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.103 [2024-04-19 10:25:06.202032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.103 [2024-04-19 10:25:06.202125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:f9190007 cdw11:2b000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.103 [2024-04-19 10:25:06.202142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.363 #42 NEW cov: 11908 ft: 14773 corp: 32/599b lim: 35 exec/s: 42 rss: 72Mb L: 15/34 MS: 1 CMP- DE: "\364\323\346\002\007\371\031\000"- 00:06:44.363 [2024-04-19 10:25:06.252040] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.363 [2024-04-19 10:25:06.252495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:f4000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.364 [2024-04-19 10:25:06.252521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.364 [2024-04-19 10:25:06.252614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:020700e6 cdw11:0000f919 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.364 [2024-04-19 10:25:06.252630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.364 [2024-04-19 10:25:06.252727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.364 [2024-04-19 10:25:06.252744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:44.364 #43 NEW cov: 11908 ft: 14782 corp: 33/622b lim: 35 exec/s: 43 rss: 72Mb L: 23/34 MS: 1 PersAutoDict- DE: "\364\323\346\002\007\371\031\000"- 00:06:44.364 [2024-04-19 10:25:06.302229] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.364 [2024-04-19 10:25:06.302697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:006000f1 cdw11:e500e5e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.364 [2024-04-19 10:25:06.302724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.364 [2024-04-19 10:25:06.302817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1a1a001a cdw11:00000a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.364 [2024-04-19 10:25:06.302835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:06:44.364 [2024-04-19 10:25:06.302927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.364 [2024-04-19 10:25:06.302946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:44.364 #44 NEW cov: 11908 ft: 14790 corp: 34/643b lim: 35 exec/s: 22 rss: 72Mb L: 21/34 MS: 1 EraseBytes- 00:06:44.364 #44 DONE cov: 11908 ft: 14790 corp: 34/643b lim: 35 exec/s: 22 rss: 72Mb 00:06:44.364 ###### Recommended dictionary. ###### 00:06:44.364 "\377\030\371\006}\313`\364" # Uses: 0 00:06:44.364 "\364\323\346\002\007\371\031\000" # Uses: 1 00:06:44.364 ###### End of recommended dictionary. ###### 00:06:44.364 Done 44 runs in 2 second(s) 00:06:44.364 10:25:06 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_2.conf /var/tmp/suppress_nvmf_fuzz 00:06:44.364 10:25:06 -- ../common.sh@72 -- # (( i++ )) 00:06:44.364 10:25:06 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:44.364 10:25:06 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:06:44.364 10:25:06 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:06:44.364 10:25:06 -- nvmf/run.sh@24 -- # local timen=1 00:06:44.364 10:25:06 -- nvmf/run.sh@25 -- # local core=0x1 00:06:44.364 10:25:06 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:06:44.364 10:25:06 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:06:44.364 10:25:06 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:06:44.364 10:25:06 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:06:44.364 10:25:06 -- nvmf/run.sh@34 -- # printf %02d 3 00:06:44.364 10:25:06 -- nvmf/run.sh@34 -- # port=4403 00:06:44.364 10:25:06 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:06:44.364 10:25:06 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:06:44.364 10:25:06 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:44.364 10:25:06 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:06:44.364 10:25:06 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:06:44.364 10:25:06 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 00:06:44.624 [2024-04-19 10:25:06.477833] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
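Between runs, nvmf/run.sh repeats the same recipe visible in the trace above: bump the fuzzer index, derive the TCP listen port as 44 plus the zero-padded index (printf %02d 3 gives port 4403), rewrite trsvcid in the JSON transport ID with sed, point -D at a fresh corpus directory, and pass -Z <index> to llvm_nvme_fuzz. The -Z value selects which admin-command handler the harness drives: the NEW_FUNC banners show fuzz_admin_identify_command for -Z 2 and, below, fuzz_admin_abort_command for -Z 3. A minimal sketch of such a type-to-handler dispatch follows; the table layout and the run_one_input() wrapper are invented for illustration and are not llvm_nvme_fuzz.c's actual code — only the two handler names are confirmed by this log.

    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    struct fuzz_ctx; /* opaque per-run state (connection, qpair, ...) */

    typedef void (*fuzz_handler)(struct fuzz_ctx *ctx,
                                 const uint8_t *data, size_t len);

    static void
    fuzz_admin_identify_command(struct fuzz_ctx *ctx,
                                const uint8_t *data, size_t len)
    {
        /* stub: the real handler frames 'data' into an IDENTIFY command */
        (void)ctx; (void)data; (void)len;
    }

    static void
    fuzz_admin_abort_command(struct fuzz_ctx *ctx,
                             const uint8_t *data, size_t len)
    {
        /* stub: the real handler frames 'data' into an ABORT command */
        (void)ctx; (void)data; (void)len;
    }

    /* Hypothetical dispatch table indexed by the -Z fuzzer type. */
    static const fuzz_handler g_handlers[] = {
        [2] = fuzz_admin_identify_command, /* -Z 2, listener on port 4402 */
        [3] = fuzz_admin_abort_command,    /* -Z 3, listener on port 4403 */
    };

    /* Called once per libFuzzer input after -Z fixes the fuzzer type. */
    static int
    run_one_input(int fuzzer_type, struct fuzz_ctx *ctx,
                  const uint8_t *data, size_t len)
    {
        if (fuzzer_type < 0 ||
            (size_t)fuzzer_type >= sizeof(g_handlers) / sizeof(g_handlers[0]) ||
            g_handlers[fuzzer_type] == NULL) {
            fprintf(stderr, "unknown fuzzer type %d\n", fuzzer_type);
            return -1;
        }
        g_handlers[fuzzer_type](ctx, data, len);
        return 0;
    }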
00:06:44.624 [2024-04-19 10:25:06.477898] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid201294 ] 00:06:44.624 EAL: No free 2048 kB hugepages reported on node 1 00:06:44.624 [2024-04-19 10:25:06.664210] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.624 [2024-04-19 10:25:06.731555] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.885 [2024-04-19 10:25:06.790846] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:44.885 [2024-04-19 10:25:06.806982] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:06:44.885 INFO: Running with entropic power schedule (0xFF, 100). 00:06:44.885 INFO: Seed: 995623772 00:06:44.885 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:06:44.885 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:06:44.885 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:06:44.885 INFO: A corpus is not provided, starting from an empty corpus 00:06:44.885 #2 INITED exec/s: 0 rss: 63Mb 00:06:44.885 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:06:44.885 This may also happen if the target rejected all inputs we tried so far 00:06:44.885 [2024-04-19 10:25:06.878166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:06:44.885 [2024-04-19 10:25:06.878201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:45.145 NEW_FUNC[1/671]: 0x486d80 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:06:45.145 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:45.145 #17 NEW cov: 11765 ft: 11765 corp: 2/11b lim: 20 exec/s: 0 rss: 69Mb L: 10/10 MS: 5 CrossOver-ChangeByte-ChangeByte-CopyPart-CMP- DE: "G\000\000\000\000\000\000\000"- 00:06:45.405 NEW_FUNC[1/7]: 0x12b8950 in nvmf_transport_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/transport.c:775 00:06:45.405 NEW_FUNC[2/7]: 0x12d9950 in nvmf_tcp_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:3504 00:06:45.405 #18 NEW cov: 11994 ft: 12468 corp: 3/21b lim: 20 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 ChangeByte- 00:06:45.405 [2024-04-19 10:25:07.289311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:06:45.405 [2024-04-19 10:25:07.289350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:45.405 #19 NEW cov: 12000 ft: 12747 corp: 4/32b lim: 20 exec/s: 0 rss: 69Mb L: 11/11 MS: 1 InsertByte- 00:06:45.405 [2024-04-19 10:25:07.339549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:06:45.405 [2024-04-19 10:25:07.339578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
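A consistency check on the INFO banner above: the inline-counter region [0x28ac78c,0x2901788) spans 0x2901788 - 0x28ac78c = 0x54ffc = 348156 bytes, one byte per edge counter, matching the advertised 348156 inline 8-bit counters; the PC table [0x2901788,0x2e51748) spans 0x54ffc0 = 5,570,496 bytes = 348156 x 16, i.e. 16 bytes per entry (a PC/flags pair of 8-byte words on a 64-bit build).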
00:06:45.405 #20 NEW cov: 12085 ft: 12966 corp: 5/42b lim: 20 exec/s: 0 rss: 69Mb L: 10/11 MS: 1 ChangeBit- 00:06:45.405 #21 NEW cov: 12085 ft: 13180 corp: 6/52b lim: 20 exec/s: 0 rss: 69Mb L: 10/11 MS: 1 ChangeBinInt- 00:06:45.405 #22 NEW cov: 12085 ft: 13259 corp: 7/60b lim: 20 exec/s: 0 rss: 70Mb L: 8/11 MS: 1 EraseBytes- 00:06:45.665 #23 NEW cov: 12102 ft: 13678 corp: 8/77b lim: 20 exec/s: 0 rss: 70Mb L: 17/17 MS: 1 InsertRepeatedBytes- 00:06:45.665 #24 NEW cov: 12102 ft: 13722 corp: 9/87b lim: 20 exec/s: 0 rss: 70Mb L: 10/17 MS: 1 ChangeByte- 00:06:45.665 #27 NEW cov: 12102 ft: 13753 corp: 10/96b lim: 20 exec/s: 0 rss: 70Mb L: 9/17 MS: 3 ShuffleBytes-ChangeBit-CMP- DE: "\377\377\377\377\377\377\377\377"- 00:06:45.665 #28 NEW cov: 12102 ft: 14057 corp: 11/102b lim: 20 exec/s: 0 rss: 70Mb L: 6/17 MS: 1 EraseBytes- 00:06:45.665 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:45.665 #29 NEW cov: 12125 ft: 14103 corp: 12/108b lim: 20 exec/s: 0 rss: 70Mb L: 6/17 MS: 1 ChangeByte- 00:06:45.925 #30 NEW cov: 12129 ft: 14204 corp: 13/122b lim: 20 exec/s: 0 rss: 70Mb L: 14/17 MS: 1 CMP- DE: "\001\031\371\007\332\311n\250"- 00:06:45.925 #31 NEW cov: 12129 ft: 14241 corp: 14/131b lim: 20 exec/s: 31 rss: 70Mb L: 9/17 MS: 1 InsertByte- 00:06:45.925 #32 NEW cov: 12129 ft: 14258 corp: 15/141b lim: 20 exec/s: 32 rss: 70Mb L: 10/17 MS: 1 CrossOver- 00:06:45.925 [2024-04-19 10:25:07.952046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:06:45.925 [2024-04-19 10:25:07.952078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:45.925 #33 NEW cov: 12129 ft: 14284 corp: 16/151b lim: 20 exec/s: 33 rss: 70Mb L: 10/17 MS: 1 ChangeBinInt- 00:06:45.925 #34 NEW cov: 12129 ft: 14300 corp: 17/158b lim: 20 exec/s: 34 rss: 70Mb L: 7/17 MS: 1 EraseBytes- 00:06:46.185 #35 NEW cov: 12129 ft: 14383 corp: 18/176b lim: 20 exec/s: 35 rss: 70Mb L: 18/18 MS: 1 InsertByte- 00:06:46.185 #36 NEW cov: 12129 ft: 14400 corp: 19/194b lim: 20 exec/s: 36 rss: 70Mb L: 18/18 MS: 1 InsertByte- 00:06:46.185 #37 NEW cov: 12129 ft: 14421 corp: 20/204b lim: 20 exec/s: 37 rss: 70Mb L: 10/18 MS: 1 ChangeByte- 00:06:46.185 #38 NEW cov: 12129 ft: 14439 corp: 21/221b lim: 20 exec/s: 38 rss: 70Mb L: 17/18 MS: 1 CopyPart- 00:06:46.446 #39 NEW cov: 12129 ft: 14529 corp: 22/231b lim: 20 exec/s: 39 rss: 71Mb L: 10/18 MS: 1 CrossOver- 00:06:46.446 #40 NEW cov: 12129 ft: 14568 corp: 23/240b lim: 20 exec/s: 40 rss: 71Mb L: 9/18 MS: 1 EraseBytes- 00:06:46.446 #41 NEW cov: 12129 ft: 14600 corp: 24/250b lim: 20 exec/s: 41 rss: 71Mb L: 10/18 MS: 1 CopyPart- 00:06:46.446 #42 NEW cov: 12129 ft: 14619 corp: 25/257b lim: 20 exec/s: 42 rss: 71Mb L: 7/18 MS: 1 ShuffleBytes- 00:06:46.446 [2024-04-19 10:25:08.514602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:06:46.446 [2024-04-19 10:25:08.514638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.446 #43 NEW cov: 12129 ft: 14691 corp: 26/267b lim: 20 exec/s: 43 rss: 71Mb L: 10/18 MS: 1 ChangeBinInt- 00:06:46.707 [2024-04-19 10:25:08.574670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:06:46.707 [2024-04-19 
10:25:08.574699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.707 #44 NEW cov: 12129 ft: 14721 corp: 27/277b lim: 20 exec/s: 44 rss: 71Mb L: 10/18 MS: 1 ChangeBinInt- 00:06:46.707 #45 NEW cov: 12129 ft: 14740 corp: 28/288b lim: 20 exec/s: 45 rss: 72Mb L: 11/18 MS: 1 InsertByte- 00:06:46.707 #46 NEW cov: 12129 ft: 14791 corp: 29/294b lim: 20 exec/s: 46 rss: 72Mb L: 6/18 MS: 1 CopyPart- 00:06:46.707 #47 NEW cov: 12129 ft: 14813 corp: 30/304b lim: 20 exec/s: 47 rss: 72Mb L: 10/18 MS: 1 ChangeBit- 00:06:46.707 #48 NEW cov: 12129 ft: 14823 corp: 31/322b lim: 20 exec/s: 48 rss: 72Mb L: 18/18 MS: 1 ShuffleBytes- 00:06:46.968 #49 NEW cov: 12129 ft: 14845 corp: 32/339b lim: 20 exec/s: 24 rss: 72Mb L: 17/18 MS: 1 CopyPart- 00:06:46.968 #49 DONE cov: 12129 ft: 14845 corp: 32/339b lim: 20 exec/s: 24 rss: 72Mb 00:06:46.968 ###### Recommended dictionary. ###### 00:06:46.968 "G\000\000\000\000\000\000\000" # Uses: 0 00:06:46.968 "\377\377\377\377\377\377\377\377" # Uses: 0 00:06:46.968 "\001\031\371\007\332\311n\250" # Uses: 0 00:06:46.968 ###### End of recommended dictionary. ###### 00:06:46.968 Done 49 runs in 2 second(s) 00:06:46.968 10:25:08 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_3.conf /var/tmp/suppress_nvmf_fuzz 00:06:46.968 10:25:08 -- ../common.sh@72 -- # (( i++ )) 00:06:46.968 10:25:08 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:46.968 10:25:08 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:06:46.969 10:25:08 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:06:46.969 10:25:08 -- nvmf/run.sh@24 -- # local timen=1 00:06:46.969 10:25:08 -- nvmf/run.sh@25 -- # local core=0x1 00:06:46.969 10:25:08 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:06:46.969 10:25:08 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:06:46.969 10:25:08 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:06:46.969 10:25:08 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:06:46.969 10:25:08 -- nvmf/run.sh@34 -- # printf %02d 4 00:06:46.969 10:25:08 -- nvmf/run.sh@34 -- # port=4404 00:06:46.969 10:25:08 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:06:46.969 10:25:09 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:06:46.969 10:25:09 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:46.969 10:25:09 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:06:46.969 10:25:09 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:06:46.969 10:25:09 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 00:06:46.969 [2024-04-19 10:25:09.033131] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
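The ../common.sh@72-73 arithmetic traces that bracket each run are the driver loop stepping the fuzzer index. Reconstructed loosely — the loop form and the counter initialization are assumptions; only the (( i++ )) / (( i < fuzz_num )) tests and the start_llvm_fuzz call appear in the trace:

  # each iteration lands on its own TCP port (44 + zero-padded index),
  # so consecutive runs never collide on a listener
  for (( i = 0; i < fuzz_num; i++ )); do
      start_llvm_fuzz "$i" 1 0x1    # fuzzer index, value passed to -t, core mask
  done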
00:06:46.969 [2024-04-19 10:25:09.033223] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid201643 ] 00:06:46.969 EAL: No free 2048 kB hugepages reported on node 1 00:06:47.229 [2024-04-19 10:25:09.212151] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.229 [2024-04-19 10:25:09.279252] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.229 [2024-04-19 10:25:09.338559] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:47.489 [2024-04-19 10:25:09.354680] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:06:47.489 INFO: Running with entropic power schedule (0xFF, 100). 00:06:47.489 INFO: Seed: 3544617868 00:06:47.489 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:06:47.489 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:06:47.489 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:06:47.489 INFO: A corpus is not provided, starting from an empty corpus 00:06:47.489 #2 INITED exec/s: 0 rss: 63Mb 00:06:47.489 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:06:47.489 This may also happen if the target rejected all inputs we tried so far 00:06:47.489 [2024-04-19 10:25:09.400080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:009f0200 cdw11:0a8f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.489 [2024-04-19 10:25:09.400109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.750 NEW_FUNC[1/671]: 0x487e70 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:06:47.750 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:47.750 #5 NEW cov: 11676 ft: 11677 corp: 2/14b lim: 35 exec/s: 0 rss: 69Mb L: 13/13 MS: 3 CMP-ChangeByte-CMP- DE: "\002\000\000\000"-"\217n\353\272\010\371\031\000"- 00:06:47.750 [2024-04-19 10:25:09.731264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.750 [2024-04-19 10:25:09.731303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.750 [2024-04-19 10:25:09.731358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:009f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.750 [2024-04-19 10:25:09.731372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.750 [2024-04-19 10:25:09.731424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ebba8f6e cdw11:08f90000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.750 [2024-04-19 10:25:09.731437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:47.750 #6 NEW cov: 11806 ft: 12906 corp: 3/36b 
lim: 35 exec/s: 0 rss: 69Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:06:47.750 [2024-04-19 10:25:09.780970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0d000200 cdw11:0a8f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.750 [2024-04-19 10:25:09.780996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.750 #7 NEW cov: 11812 ft: 13099 corp: 4/49b lim: 35 exec/s: 0 rss: 69Mb L: 13/22 MS: 1 ChangeBinInt- 00:06:47.750 [2024-04-19 10:25:09.821093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:09000200 cdw11:0a8f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.750 [2024-04-19 10:25:09.821118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.750 #8 NEW cov: 11897 ft: 13400 corp: 5/62b lim: 35 exec/s: 0 rss: 70Mb L: 13/22 MS: 1 ChangeBit- 00:06:48.011 [2024-04-19 10:25:09.861540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.011 [2024-04-19 10:25:09.861566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.011 [2024-04-19 10:25:09.861620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:009f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.011 [2024-04-19 10:25:09.861638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.011 [2024-04-19 10:25:09.861691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ebba8f6e cdw11:08f90000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.011 [2024-04-19 10:25:09.861706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.011 #9 NEW cov: 11897 ft: 13516 corp: 6/84b lim: 35 exec/s: 0 rss: 70Mb L: 22/22 MS: 1 ShuffleBytes- 00:06:48.011 [2024-04-19 10:25:09.911661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.011 [2024-04-19 10:25:09.911686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.011 [2024-04-19 10:25:09.911739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:009f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.011 [2024-04-19 10:25:09.911754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.011 [2024-04-19 10:25:09.911807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ebba1c6e cdw11:08f90000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.011 [2024-04-19 10:25:09.911825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.011 #10 NEW cov: 11897 ft: 13632 corp: 7/106b lim: 35 exec/s: 0 rss: 70Mb L: 22/22 MS: 1 ChangeByte- 00:06:48.011 [2024-04-19 10:25:09.961644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.011 [2024-04-19 10:25:09.961668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.011 [2024-04-19 10:25:09.961722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:09000000 cdw11:0a8f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.011 [2024-04-19 10:25:09.961735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.011 #11 NEW cov: 11897 ft: 13888 corp: 8/126b lim: 35 exec/s: 0 rss: 70Mb L: 20/22 MS: 1 InsertRepeatedBytes- 00:06:48.011 [2024-04-19 10:25:10.011970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:006e0200 cdw11:ebba0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.011 [2024-04-19 10:25:10.011996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.011 [2024-04-19 10:25:10.012049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.011 [2024-04-19 10:25:10.012063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.011 [2024-04-19 10:25:10.012115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0a8f009f cdw11:6eeb0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.011 [2024-04-19 10:25:10.012129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.011 #12 NEW cov: 11897 ft: 13998 corp: 9/151b lim: 35 exec/s: 0 rss: 70Mb L: 25/25 MS: 1 CrossOver- 00:06:48.011 [2024-04-19 10:25:10.051751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0d0200 cdw11:006e0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.011 [2024-04-19 10:25:10.051779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.011 #13 NEW cov: 11897 ft: 14155 corp: 10/164b lim: 35 exec/s: 0 rss: 70Mb L: 13/25 MS: 1 ShuffleBytes- 00:06:48.011 [2024-04-19 10:25:10.091908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000202 cdw11:008f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.011 [2024-04-19 10:25:10.091939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.011 #14 NEW cov: 11897 ft: 14178 corp: 11/177b lim: 35 exec/s: 0 rss: 70Mb L: 13/25 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:06:48.272 [2024-04-19 10:25:10.131957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:000a0209 cdw11:8f6e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.272 [2024-04-19 10:25:10.131983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.272 #15 NEW cov: 11897 ft: 14328 corp: 12/189b lim: 35 exec/s: 0 rss: 70Mb L: 12/25 MS: 1 EraseBytes- 00:06:48.272 [2024-04-19 10:25:10.172603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:009f0200 cdw11:0a8f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.272 [2024-04-19 10:25:10.172627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.272 [2024-04-19 10:25:10.172681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:08f8ebba cdw11:f8f80003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.272 [2024-04-19 10:25:10.172695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.272 [2024-04-19 10:25:10.172750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:f8f80003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.272 [2024-04-19 10:25:10.172765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.272 [2024-04-19 10:25:10.172821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:f8f8f8f8 cdw11:f8f80003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.272 [2024-04-19 10:25:10.172835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:48.272 #16 NEW cov: 11897 ft: 14706 corp: 13/223b lim: 35 exec/s: 0 rss: 70Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:06:48.272 [2024-04-19 10:25:10.212363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.272 [2024-04-19 10:25:10.212388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.272 [2024-04-19 10:25:10.212445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:6eeb0a8f cdw11:ba080003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.272 [2024-04-19 10:25:10.212460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.272 #17 NEW cov: 11897 ft: 14742 corp: 14/239b lim: 35 exec/s: 0 rss: 70Mb L: 16/34 MS: 1 EraseBytes- 00:06:48.272 [2024-04-19 10:25:10.252463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:9f0a0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.272 [2024-04-19 10:25:10.252489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.272 [2024-04-19 10:25:10.252544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ba086eeb cdw11:f9190000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.272 [2024-04-19 10:25:10.252559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.272 #18 NEW cov: 11897 ft: 14744 corp: 15/253b lim: 35 exec/s: 0 rss: 70Mb L: 14/34 MS: 1 EraseBytes- 00:06:48.272 [2024-04-19 10:25:10.302931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:009f0200 cdw11:0a8f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.272 [2024-04-19 10:25:10.302958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.272 [2024-04-19 10:25:10.303013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:08f8ebba cdw11:f8f80003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.272 [2024-04-19 10:25:10.303027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.272 [2024-04-19 10:25:10.303082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:f8f80003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.272 [2024-04-19 10:25:10.303097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.272 [2024-04-19 10:25:10.303151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:f8f8f8f8 cdw11:f8f80003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.272 [2024-04-19 10:25:10.303166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:48.272 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:48.272 #19 NEW cov: 11920 ft: 14835 corp: 16/287b lim: 35 exec/s: 0 rss: 70Mb L: 34/34 MS: 1 ChangeBinInt- 00:06:48.272 [2024-04-19 10:25:10.352767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:09000200 cdw11:0a8f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.272 [2024-04-19 10:25:10.352792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.272 [2024-04-19 10:25:10.352853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:a1eba1a1 cdw11:ba080003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.272 [2024-04-19 10:25:10.352868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.272 #20 NEW cov: 11920 ft: 14889 corp: 17/303b lim: 35 exec/s: 0 rss: 70Mb L: 16/34 MS: 1 InsertRepeatedBytes- 00:06:48.534 [2024-04-19 10:25:10.393184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.534 [2024-04-19 10:25:10.393211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.534 [2024-04-19 10:25:10.393266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:09000000 cdw11:0a8f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.534 [2024-04-19 10:25:10.393280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.534 [2024-04-19 10:25:10.393337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:08f9ebba cdw11:19100000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.534 [2024-04-19 10:25:10.393351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.534 [2024-04-19 10:25:10.393404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:10101010 cdw11:10100000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:06:48.534 [2024-04-19 10:25:10.393419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:48.534 #21 NEW cov: 11920 ft: 14909 corp: 18/334b lim: 35 exec/s: 21 rss: 70Mb L: 31/34 MS: 1 InsertRepeatedBytes- 00:06:48.534 [2024-04-19 10:25:10.443179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.534 [2024-04-19 10:25:10.443204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.534 [2024-04-19 10:25:10.443261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:009f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.534 [2024-04-19 10:25:10.443275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.534 [2024-04-19 10:25:10.443328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:6e8febba cdw11:f9080000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.534 [2024-04-19 10:25:10.443342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.534 #22 NEW cov: 11920 ft: 14945 corp: 19/356b lim: 35 exec/s: 22 rss: 70Mb L: 22/34 MS: 1 ShuffleBytes- 00:06:48.534 [2024-04-19 10:25:10.482987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0deb8f cdw11:00020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.534 [2024-04-19 10:25:10.483011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.534 #23 NEW cov: 11920 ft: 15022 corp: 20/369b lim: 35 exec/s: 23 rss: 71Mb L: 13/34 MS: 1 ShuffleBytes- 00:06:48.534 [2024-04-19 10:25:10.523102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ebba8f6e cdw11:08f90000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.534 [2024-04-19 10:25:10.523126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.534 #24 NEW cov: 11920 ft: 15085 corp: 21/378b lim: 35 exec/s: 24 rss: 71Mb L: 9/34 MS: 1 PersAutoDict- DE: "\217n\353\272\010\371\031\000"- 00:06:48.534 [2024-04-19 10:25:10.563165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:abba8f6e cdw11:08f90000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.534 [2024-04-19 10:25:10.563190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.534 #25 NEW cov: 11920 ft: 15168 corp: 22/387b lim: 35 exec/s: 25 rss: 71Mb L: 9/34 MS: 1 ChangeBit- 00:06:48.534 [2024-04-19 10:25:10.603660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.534 [2024-04-19 10:25:10.603684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.534 [2024-04-19 10:25:10.603756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00d40000 cdw11:009f0000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:06:48.534 [2024-04-19 10:25:10.603771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.534 [2024-04-19 10:25:10.603830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ebba8f6e cdw11:08f90000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.534 [2024-04-19 10:25:10.603844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.534 #26 NEW cov: 11920 ft: 15176 corp: 23/409b lim: 35 exec/s: 26 rss: 71Mb L: 22/34 MS: 1 ChangeByte- 00:06:48.534 [2024-04-19 10:25:10.643610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.534 [2024-04-19 10:25:10.643635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.534 [2024-04-19 10:25:10.643690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:009f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.534 [2024-04-19 10:25:10.643704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.795 #27 NEW cov: 11920 ft: 15186 corp: 24/429b lim: 35 exec/s: 27 rss: 71Mb L: 20/34 MS: 1 CrossOver- 00:06:48.795 [2024-04-19 10:25:10.683572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:abba8f6e cdw11:08f90000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.795 [2024-04-19 10:25:10.683596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.795 #30 NEW cov: 11920 ft: 15198 corp: 25/437b lim: 35 exec/s: 30 rss: 71Mb L: 8/34 MS: 3 ChangeBinInt-ShuffleBytes-CrossOver- 00:06:48.795 [2024-04-19 10:25:10.724011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.795 [2024-04-19 10:25:10.724035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.795 [2024-04-19 10:25:10.724095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff9f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.795 [2024-04-19 10:25:10.724109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.795 [2024-04-19 10:25:10.724163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ebba1c6e cdw11:08f90000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.795 [2024-04-19 10:25:10.724177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.795 #31 NEW cov: 11920 ft: 15212 corp: 26/459b lim: 35 exec/s: 31 rss: 71Mb L: 22/34 MS: 1 ChangeBinInt- 00:06:48.795 [2024-04-19 10:25:10.774498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:009f0200 cdw11:0a8f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.795 [2024-04-19 10:25:10.774522] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.795 [2024-04-19 10:25:10.774576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:08f8ebba cdw11:f8f80003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.795 [2024-04-19 10:25:10.774589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.795 [2024-04-19 10:25:10.774642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:f8f80003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.795 [2024-04-19 10:25:10.774656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.795 [2024-04-19 10:25:10.774709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:f8f8f8f8 cdw11:f8f80003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.795 [2024-04-19 10:25:10.774723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:48.795 [2024-04-19 10:25:10.774777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:f802f8f8 cdw11:00f90000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.795 [2024-04-19 10:25:10.774791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:48.795 #32 NEW cov: 11920 ft: 15263 corp: 27/494b lim: 35 exec/s: 32 rss: 71Mb L: 35/35 MS: 1 CrossOver- 00:06:48.795 [2024-04-19 10:25:10.814285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.795 [2024-04-19 10:25:10.814312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.795 [2024-04-19 10:25:10.814367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:9f0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.795 [2024-04-19 10:25:10.814381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.795 [2024-04-19 10:25:10.814441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ba086eeb cdw11:f9190000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.795 [2024-04-19 10:25:10.814455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.795 #34 NEW cov: 11920 ft: 15273 corp: 28/516b lim: 35 exec/s: 34 rss: 71Mb L: 22/35 MS: 2 ChangeByte-CrossOver- 00:06:48.795 [2024-04-19 10:25:10.854342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.795 [2024-04-19 10:25:10.854367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.795 [2024-04-19 10:25:10.854424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:009f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.795 [2024-04-19 10:25:10.854438] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.795 [2024-04-19 10:25:10.854494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ebba8f6e cdw11:08f90000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.795 [2024-04-19 10:25:10.854509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.795 #35 NEW cov: 11920 ft: 15296 corp: 29/538b lim: 35 exec/s: 35 rss: 71Mb L: 22/35 MS: 1 ShuffleBytes- 00:06:48.795 [2024-04-19 10:25:10.894555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.795 [2024-04-19 10:25:10.894579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.795 [2024-04-19 10:25:10.894634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:09000000 cdw11:0a8f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.795 [2024-04-19 10:25:10.894648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.795 [2024-04-19 10:25:10.894704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:08f97eba cdw11:19100000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.795 [2024-04-19 10:25:10.894718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.796 [2024-04-19 10:25:10.894773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:10101010 cdw11:10100000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.796 [2024-04-19 10:25:10.894787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:49.057 #36 NEW cov: 11920 ft: 15324 corp: 30/569b lim: 35 exec/s: 36 rss: 71Mb L: 31/35 MS: 1 ChangeByte- 00:06:49.057 [2024-04-19 10:25:10.944603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.057 [2024-04-19 10:25:10.944628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.057 [2024-04-19 10:25:10.944684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:009f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.057 [2024-04-19 10:25:10.944698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.057 [2024-04-19 10:25:10.944755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ebba1c6e cdw11:05f90000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.057 [2024-04-19 10:25:10.944769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.057 #37 NEW cov: 11920 ft: 15366 corp: 31/591b lim: 35 exec/s: 37 rss: 72Mb L: 22/35 MS: 1 ChangeBinInt- 00:06:49.057 [2024-04-19 10:25:10.984610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 
cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.057 [2024-04-19 10:25:10.984635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.057 [2024-04-19 10:25:10.984690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:009f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.057 [2024-04-19 10:25:10.984704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.057 [2024-04-19 10:25:10.984760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ebba8f6e cdw11:08f90000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.057 [2024-04-19 10:25:10.984774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.057 #38 NEW cov: 11920 ft: 15383 corp: 32/613b lim: 35 exec/s: 38 rss: 72Mb L: 22/35 MS: 1 ShuffleBytes- 00:06:49.057 [2024-04-19 10:25:11.024892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.057 [2024-04-19 10:25:11.024916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.057 [2024-04-19 10:25:11.024971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:09000000 cdw11:0a8f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.057 [2024-04-19 10:25:11.024985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.057 [2024-04-19 10:25:11.025040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:08f9ebba cdw11:19100000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.057 [2024-04-19 10:25:11.025054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.057 [2024-04-19 10:25:11.025109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:10101010 cdw11:14100000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.057 [2024-04-19 10:25:11.025122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:49.057 #39 NEW cov: 11920 ft: 15397 corp: 33/644b lim: 35 exec/s: 39 rss: 72Mb L: 31/35 MS: 1 ChangeBit- 00:06:49.057 [2024-04-19 10:25:11.064702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.057 [2024-04-19 10:25:11.064727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.057 [2024-04-19 10:25:11.064784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:6e080a8f cdw11:baeb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.057 [2024-04-19 10:25:11.064798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.057 #40 NEW cov: 11920 ft: 15415 corp: 34/660b lim: 35 exec/s: 40 rss: 72Mb L: 16/35 MS: 1 ShuffleBytes- 00:06:49.057 [2024-04-19 
10:25:11.115201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:19100000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.057 [2024-04-19 10:25:11.115225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.057 [2024-04-19 10:25:11.115282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00090000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.057 [2024-04-19 10:25:11.115298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.057 [2024-04-19 10:25:11.115354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:6e7e0a8f cdw11:ba080003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.057 [2024-04-19 10:25:11.115368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.057 [2024-04-19 10:25:11.115424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:10101910 cdw11:10100000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.057 [2024-04-19 10:25:11.115437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:49.057 #41 NEW cov: 11920 ft: 15420 corp: 35/694b lim: 35 exec/s: 41 rss: 72Mb L: 34/35 MS: 1 CopyPart- 00:06:49.057 [2024-04-19 10:25:11.165353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:84000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.057 [2024-04-19 10:25:11.165378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.057 [2024-04-19 10:25:11.165434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:09000000 cdw11:0a8f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.057 [2024-04-19 10:25:11.165448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.057 [2024-04-19 10:25:11.165503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:08f97eba cdw11:19100000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.058 [2024-04-19 10:25:11.165517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.058 [2024-04-19 10:25:11.165570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:10101010 cdw11:10100000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.058 [2024-04-19 10:25:11.165585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:49.318 #42 NEW cov: 11920 ft: 15425 corp: 36/725b lim: 35 exec/s: 42 rss: 72Mb L: 31/35 MS: 1 ChangeByte- 00:06:49.318 [2024-04-19 10:25:11.205251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.318 [2024-04-19 10:25:11.205275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.318 [2024-04-19 
10:25:11.205332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:9f0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.318 [2024-04-19 10:25:11.205346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.318 [2024-04-19 10:25:11.205400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ba056eeb cdw11:f9190000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.318 [2024-04-19 10:25:11.205414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.318 #43 NEW cov: 11920 ft: 15434 corp: 37/746b lim: 35 exec/s: 43 rss: 72Mb L: 21/35 MS: 1 EraseBytes- 00:06:49.318 [2024-04-19 10:25:11.255699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:009f0200 cdw11:0a8f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.318 [2024-04-19 10:25:11.255722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.318 [2024-04-19 10:25:11.255780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:08f8ebba cdw11:f8f80003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.318 [2024-04-19 10:25:11.255797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.318 [2024-04-19 10:25:11.255823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:f8f80003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.318 [2024-04-19 10:25:11.255849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.318 [2024-04-19 10:25:11.255905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:f8f8f8f8 cdw11:f8f80003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.318 [2024-04-19 10:25:11.255920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:49.318 [2024-04-19 10:25:11.255977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:f802f8f8 cdw11:00080000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.318 [2024-04-19 10:25:11.255990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:49.318 #44 NEW cov: 11920 ft: 15479 corp: 38/781b lim: 35 exec/s: 44 rss: 72Mb L: 35/35 MS: 1 CopyPart- 00:06:49.318 [2024-04-19 10:25:11.305565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.318 [2024-04-19 10:25:11.305589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.318 [2024-04-19 10:25:11.305644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:009f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.319 [2024-04-19 10:25:11.305657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.319 [2024-04-19 
10:25:11.305714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:05028f6e cdw11:000d0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.319 [2024-04-19 10:25:11.305727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.319 #45 NEW cov: 11920 ft: 15488 corp: 39/802b lim: 35 exec/s: 45 rss: 72Mb L: 21/35 MS: 1 InsertByte- 00:06:49.319 [2024-04-19 10:25:11.355517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a1a1026e cdw11:0a8f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.319 [2024-04-19 10:25:11.355542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.319 [2024-04-19 10:25:11.355597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:a1eba1a1 cdw11:ba080003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.319 [2024-04-19 10:25:11.355611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.319 #46 NEW cov: 11920 ft: 15492 corp: 40/818b lim: 35 exec/s: 46 rss: 72Mb L: 16/35 MS: 1 CopyPart- 00:06:49.319 [2024-04-19 10:25:11.405820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:f9f9f9f9 cdw11:f9f90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.319 [2024-04-19 10:25:11.405846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.319 [2024-04-19 10:25:11.405901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:f9f9f9f9 cdw11:f9f90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.319 [2024-04-19 10:25:11.405914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.319 [2024-04-19 10:25:11.405972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:f9f9f9f9 cdw11:f9f90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.319 [2024-04-19 10:25:11.405988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.319 #47 NEW cov: 11920 ft: 15509 corp: 41/841b lim: 35 exec/s: 23 rss: 72Mb L: 23/35 MS: 1 InsertRepeatedBytes- 00:06:49.319 #47 DONE cov: 11920 ft: 15509 corp: 41/841b lim: 35 exec/s: 23 rss: 72Mb 00:06:49.319 ###### Recommended dictionary. ###### 00:06:49.319 "\002\000\000\000" # Uses: 1 00:06:49.319 "\217n\353\272\010\371\031\000" # Uses: 1 00:06:49.319 ###### End of recommended dictionary. 
###### 00:06:49.319 Done 47 runs in 2 second(s) 00:06:49.580 10:25:11 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_4.conf /var/tmp/suppress_nvmf_fuzz 00:06:49.580 10:25:11 -- ../common.sh@72 -- # (( i++ )) 00:06:49.580 10:25:11 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:49.580 10:25:11 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:06:49.580 10:25:11 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:06:49.580 10:25:11 -- nvmf/run.sh@24 -- # local timen=1 00:06:49.580 10:25:11 -- nvmf/run.sh@25 -- # local core=0x1 00:06:49.580 10:25:11 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:06:49.580 10:25:11 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:06:49.580 10:25:11 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:06:49.580 10:25:11 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:06:49.580 10:25:11 -- nvmf/run.sh@34 -- # printf %02d 5 00:06:49.580 10:25:11 -- nvmf/run.sh@34 -- # port=4405 00:06:49.580 10:25:11 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:06:49.580 10:25:11 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:06:49.580 10:25:11 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:49.580 10:25:11 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:06:49.580 10:25:11 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:06:49.580 10:25:11 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 00:06:49.580 [2024-04-19 10:25:11.584393] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:06:49.580 [2024-04-19 10:25:11.584460] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid202000 ] 00:06:49.580 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.840 [2024-04-19 10:25:11.763851] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.840 [2024-04-19 10:25:11.831765] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.840 [2024-04-19 10:25:11.890725] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:49.840 [2024-04-19 10:25:11.906865] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:06:49.840 INFO: Running with entropic power schedule (0xFF, 100). 
00:06:49.840 INFO: Seed: 1800653514 00:06:49.840 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:06:49.840 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:06:49.840 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:06:49.840 INFO: A corpus is not provided, starting from an empty corpus 00:06:49.840 #2 INITED exec/s: 0 rss: 63Mb 00:06:49.840 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:06:49.840 This may also happen if the target rejected all inputs we tried so far 00:06:50.100 [2024-04-19 10:25:11.956201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.100 [2024-04-19 10:25:11.956232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.360 NEW_FUNC[1/670]: 0x48a000 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:06:50.360 NEW_FUNC[2/670]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:50.360 #21 NEW cov: 11679 ft: 11677 corp: 2/16b lim: 45 exec/s: 0 rss: 69Mb L: 15/15 MS: 4 ChangeBit-ChangeBit-InsertByte-InsertRepeatedBytes- 00:06:50.360 [2024-04-19 10:25:12.318579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.360 [2024-04-19 10:25:12.318620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.360 NEW_FUNC[1/1]: 0xf7d690 in posix_sock_readv /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/sock/posix/posix.c:1553 00:06:50.360 #22 NEW cov: 11817 ft: 12244 corp: 3/26b lim: 45 exec/s: 0 rss: 69Mb L: 10/15 MS: 1 EraseBytes- 00:06:50.360 [2024-04-19 10:25:12.378732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.360 [2024-04-19 10:25:12.378764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.360 #23 NEW cov: 11823 ft: 12518 corp: 4/42b lim: 45 exec/s: 0 rss: 69Mb L: 16/16 MS: 1 InsertByte- 00:06:50.360 [2024-04-19 10:25:12.428909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.360 [2024-04-19 10:25:12.428936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.360 #24 NEW cov: 11908 ft: 12728 corp: 5/58b lim: 45 exec/s: 0 rss: 70Mb L: 16/16 MS: 1 ChangeBit- 00:06:50.620 [2024-04-19 10:25:12.489083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.620 [2024-04-19 10:25:12.489110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.620 #30 NEW cov: 11908 ft: 12836 corp: 6/68b lim: 45 exec/s: 0 rss: 70Mb L: 10/16 MS: 1 ShuffleBytes- 
00:06:50.620 [2024-04-19 10:25:12.549238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.620 [2024-04-19 10:25:12.549264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.620 #31 NEW cov: 11908 ft: 12960 corp: 7/84b lim: 45 exec/s: 0 rss: 70Mb L: 16/16 MS: 1 CrossOver- 00:06:50.620 [2024-04-19 10:25:12.600566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.620 [2024-04-19 10:25:12.600591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.620 [2024-04-19 10:25:12.600686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.620 [2024-04-19 10:25:12.600702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.620 [2024-04-19 10:25:12.600787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.620 [2024-04-19 10:25:12.600803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:50.620 [2024-04-19 10:25:12.600901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:75757575 cdw11:75ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.620 [2024-04-19 10:25:12.600915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:50.620 #32 NEW cov: 11908 ft: 13877 corp: 8/125b lim: 45 exec/s: 0 rss: 70Mb L: 41/41 MS: 1 InsertRepeatedBytes- 00:06:50.620 [2024-04-19 10:25:12.659331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.620 [2024-04-19 10:25:12.659361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.620 #33 NEW cov: 11908 ft: 13934 corp: 9/141b lim: 45 exec/s: 0 rss: 70Mb L: 16/41 MS: 1 ChangeBit- 00:06:50.620 [2024-04-19 10:25:12.730179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.620 [2024-04-19 10:25:12.730213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.620 [2024-04-19 10:25:12.730326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:3bffffff cdw11:ff010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.620 [2024-04-19 10:25:12.730346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.880 #34 NEW cov: 11908 ft: 14367 corp: 10/161b lim: 45 exec/s: 0 rss: 70Mb L: 20/41 MS: 1 CMP- DE: "\001\002\000\000"- 00:06:50.880 [2024-04-19 10:25:12.789979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 
cid:4 nsid:0 cdw10:efff2eff cdw11:ffff0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.880 [2024-04-19 10:25:12.790013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.880 #35 NEW cov: 11908 ft: 14533 corp: 11/172b lim: 45 exec/s: 0 rss: 70Mb L: 11/41 MS: 1 EraseBytes- 00:06:50.880 [2024-04-19 10:25:12.850383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.880 [2024-04-19 10:25:12.850409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.880 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:50.880 #36 NEW cov: 11931 ft: 14572 corp: 12/182b lim: 45 exec/s: 0 rss: 70Mb L: 10/41 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:06:50.880 [2024-04-19 10:25:12.910542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ff3b0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.880 [2024-04-19 10:25:12.910570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.880 #37 NEW cov: 11931 ft: 14640 corp: 13/197b lim: 45 exec/s: 0 rss: 70Mb L: 15/41 MS: 1 CrossOver- 00:06:50.880 [2024-04-19 10:25:12.960762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:efff2eff cdw11:ffff0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.880 [2024-04-19 10:25:12.960787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.880 #38 NEW cov: 11931 ft: 14679 corp: 14/208b lim: 45 exec/s: 38 rss: 70Mb L: 11/41 MS: 1 ChangeBinInt- 00:06:51.141 [2024-04-19 10:25:13.021404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.141 [2024-04-19 10:25:13.021430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.141 [2024-04-19 10:25:13.021523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffff3b cdw11:01020007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.141 [2024-04-19 10:25:13.021539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.141 #39 NEW cov: 11931 ft: 14789 corp: 15/228b lim: 45 exec/s: 39 rss: 70Mb L: 20/41 MS: 1 CrossOver- 00:06:51.141 [2024-04-19 10:25:13.081605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.141 [2024-04-19 10:25:13.081632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.141 [2024-04-19 10:25:13.081723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.141 [2024-04-19 10:25:13.081740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:06:51.141 #40 NEW cov: 11931 ft: 14805 corp: 16/246b lim: 45 exec/s: 40 rss: 70Mb L: 18/41 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:06:51.141 [2024-04-19 10:25:13.131370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.141 [2024-04-19 10:25:13.131395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.141 #46 NEW cov: 11931 ft: 14857 corp: 17/258b lim: 45 exec/s: 46 rss: 70Mb L: 12/41 MS: 1 EraseBytes- 00:06:51.141 [2024-04-19 10:25:13.181882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.141 [2024-04-19 10:25:13.181907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.141 [2024-04-19 10:25:13.182011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffff3b cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.141 [2024-04-19 10:25:13.182027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.141 #47 NEW cov: 11931 ft: 14884 corp: 18/278b lim: 45 exec/s: 47 rss: 71Mb L: 20/41 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:06:51.141 [2024-04-19 10:25:13.242408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00005500 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.141 [2024-04-19 10:25:13.242434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.141 [2024-04-19 10:25:13.242526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.141 [2024-04-19 10:25:13.242542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.141 [2024-04-19 10:25:13.242625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.141 [2024-04-19 10:25:13.242642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.404 #50 NEW cov: 11931 ft: 15134 corp: 19/312b lim: 45 exec/s: 50 rss: 71Mb L: 34/41 MS: 3 CopyPart-ChangeByte-InsertRepeatedBytes- 00:06:51.404 [2024-04-19 10:25:13.293385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.404 [2024-04-19 10:25:13.293410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.404 [2024-04-19 10:25:13.293499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.404 [2024-04-19 10:25:13.293515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.404 [2024-04-19 
10:25:13.293605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:75757575 cdw11:75750000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.404 [2024-04-19 10:25:13.293626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.404 [2024-04-19 10:25:13.293716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:75750075 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.404 [2024-04-19 10:25:13.293731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.404 [2024-04-19 10:25:13.293824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:3bff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.404 [2024-04-19 10:25:13.293841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:51.404 #51 NEW cov: 11931 ft: 15204 corp: 20/357b lim: 45 exec/s: 51 rss: 71Mb L: 45/45 MS: 1 PersAutoDict- DE: "\001\002\000\000"- 00:06:51.404 [2024-04-19 10:25:13.352562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.404 [2024-04-19 10:25:13.352586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.404 [2024-04-19 10:25:13.352680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffff9cff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.404 [2024-04-19 10:25:13.352695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.404 #52 NEW cov: 11931 ft: 15211 corp: 21/379b lim: 45 exec/s: 52 rss: 71Mb L: 22/45 MS: 1 InsertRepeatedBytes- 00:06:51.404 [2024-04-19 10:25:13.402421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:efff2eff cdw11:ffff0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.404 [2024-04-19 10:25:13.402446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.404 #53 NEW cov: 11931 ft: 15238 corp: 22/390b lim: 45 exec/s: 53 rss: 71Mb L: 11/45 MS: 1 ChangeByte- 00:06:51.404 [2024-04-19 10:25:13.452656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.404 [2024-04-19 10:25:13.452683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.404 #54 NEW cov: 11931 ft: 15259 corp: 23/403b lim: 45 exec/s: 54 rss: 71Mb L: 13/45 MS: 1 EraseBytes- 00:06:51.404 [2024-04-19 10:25:13.502688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fbff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.404 [2024-04-19 10:25:13.502717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.664 #55 NEW cov: 11931 ft: 15278 corp: 24/413b lim: 45 exec/s: 55 rss: 71Mb L: 10/45 MS: 1 ChangeBit- 00:06:51.664 [2024-04-19 
10:25:13.563040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:002e2eff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.664 [2024-04-19 10:25:13.563067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.664 #58 NEW cov: 11931 ft: 15298 corp: 25/422b lim: 45 exec/s: 58 rss: 71Mb L: 9/45 MS: 3 EraseBytes-ChangeBit-CopyPart- 00:06:51.664 [2024-04-19 10:25:13.623313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.664 [2024-04-19 10:25:13.623340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.664 #59 NEW cov: 11931 ft: 15309 corp: 26/432b lim: 45 exec/s: 59 rss: 71Mb L: 10/45 MS: 1 ChangeBit- 00:06:51.664 [2024-04-19 10:25:13.673485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.664 [2024-04-19 10:25:13.673513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.664 #60 NEW cov: 11931 ft: 15376 corp: 27/448b lim: 45 exec/s: 60 rss: 71Mb L: 16/45 MS: 1 CopyPart- 00:06:51.664 [2024-04-19 10:25:13.723719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff012eff cdw11:05ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.664 [2024-04-19 10:25:13.723746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.664 #61 NEW cov: 11931 ft: 15382 corp: 28/458b lim: 45 exec/s: 61 rss: 71Mb L: 10/45 MS: 1 ChangeBinInt- 00:06:51.925 [2024-04-19 10:25:13.783966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:efff2eff cdw11:ff010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.925 [2024-04-19 10:25:13.783995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.925 #62 NEW cov: 11931 ft: 15387 corp: 29/473b lim: 45 exec/s: 62 rss: 72Mb L: 15/45 MS: 1 PersAutoDict- DE: "\001\002\000\000"- 00:06:51.925 [2024-04-19 10:25:13.844327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000102 cdw11:2eff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.925 [2024-04-19 10:25:13.844354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.925 #63 NEW cov: 11931 ft: 15416 corp: 30/489b lim: 45 exec/s: 63 rss: 72Mb L: 16/45 MS: 1 PersAutoDict- DE: "\001\002\000\000"- 00:06:51.925 [2024-04-19 10:25:13.904841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.925 [2024-04-19 10:25:13.904865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.925 [2024-04-19 10:25:13.904956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffff3b cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.926 [2024-04-19 10:25:13.904974] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.926 #64 NEW cov: 11931 ft: 15445 corp: 31/509b lim: 45 exec/s: 64 rss: 72Mb L: 20/45 MS: 1 ChangeBinInt- 00:06:51.926 [2024-04-19 10:25:13.965162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff2eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.926 [2024-04-19 10:25:13.965188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.926 [2024-04-19 10:25:13.965285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:01023bff cdw11:00000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.926 [2024-04-19 10:25:13.965301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.926 #65 NEW cov: 11931 ft: 15454 corp: 32/529b lim: 45 exec/s: 32 rss: 72Mb L: 20/45 MS: 1 PersAutoDict- DE: "\001\002\000\000"- 00:06:51.926 #65 DONE cov: 11931 ft: 15454 corp: 32/529b lim: 45 exec/s: 32 rss: 72Mb 00:06:51.926 ###### Recommended dictionary. ###### 00:06:51.926 "\001\002\000\000" # Uses: 4 00:06:51.926 "\377\377\377\377\377\377\377\377" # Uses: 2 00:06:51.926 ###### End of recommended dictionary. ###### 00:06:51.926 Done 65 runs in 2 second(s) 00:06:52.187 10:25:14 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_5.conf /var/tmp/suppress_nvmf_fuzz 00:06:52.187 10:25:14 -- ../common.sh@72 -- # (( i++ )) 00:06:52.187 10:25:14 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:52.187 10:25:14 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:06:52.187 10:25:14 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:06:52.187 10:25:14 -- nvmf/run.sh@24 -- # local timen=1 00:06:52.187 10:25:14 -- nvmf/run.sh@25 -- # local core=0x1 00:06:52.187 10:25:14 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:06:52.187 10:25:14 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:06:52.187 10:25:14 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:06:52.187 10:25:14 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:06:52.187 10:25:14 -- nvmf/run.sh@34 -- # printf %02d 6 00:06:52.187 10:25:14 -- nvmf/run.sh@34 -- # port=4406 00:06:52.187 10:25:14 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:06:52.187 10:25:14 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:06:52.187 10:25:14 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:52.187 10:25:14 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:06:52.187 10:25:14 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:06:52.187 10:25:14 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 00:06:52.187 [2024-04-19 
10:25:14.153867] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:06:52.187 [2024-04-19 10:25:14.153955] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid202348 ] 00:06:52.187 EAL: No free 2048 kB hugepages reported on node 1 00:06:52.448 [2024-04-19 10:25:14.331844] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.448 [2024-04-19 10:25:14.399462] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.448 [2024-04-19 10:25:14.458420] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:52.448 [2024-04-19 10:25:14.474581] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:06:52.448 INFO: Running with entropic power schedule (0xFF, 100). 00:06:52.448 INFO: Seed: 72705717 00:06:52.448 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:06:52.448 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:06:52.448 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:06:52.448 INFO: A corpus is not provided, starting from an empty corpus 00:06:52.448 #2 INITED exec/s: 0 rss: 62Mb 00:06:52.448 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:06:52.448 This may also happen if the target rejected all inputs we tried so far 00:06:52.448 [2024-04-19 10:25:14.523149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000410a cdw11:00000000 00:06:52.448 [2024-04-19 10:25:14.523177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.708 NEW_FUNC[1/669]: 0x48c810 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:06:52.708 NEW_FUNC[2/669]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:52.708 #3 NEW cov: 11604 ft: 11605 corp: 2/3b lim: 10 exec/s: 0 rss: 69Mb L: 2/2 MS: 1 InsertByte- 00:06:52.969 [2024-04-19 10:25:14.833992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004177 cdw11:00000000 00:06:52.969 [2024-04-19 10:25:14.834029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.969 #4 NEW cov: 11734 ft: 12077 corp: 3/5b lim: 10 exec/s: 0 rss: 69Mb L: 2/2 MS: 1 ChangeByte- 00:06:52.969 [2024-04-19 10:25:14.884011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004177 cdw11:00000000 00:06:52.969 [2024-04-19 10:25:14.884039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.969 #5 NEW cov: 11740 ft: 12305 corp: 4/8b lim: 10 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 CrossOver- 00:06:52.969 [2024-04-19 10:25:14.924100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004141 cdw11:00000000 00:06:52.969 [2024-04-19 10:25:14.924124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.969 #6 NEW cov: 11825 ft: 12683 corp: 5/10b lim: 10 exec/s: 0 rss: 70Mb L: 2/3 MS: 1 CrossOver- 00:06:52.969 [2024-04-19 10:25:14.964238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004177 cdw11:00000000 00:06:52.969 [2024-04-19 10:25:14.964262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.969 #7 NEW cov: 11825 ft: 12794 corp: 6/13b lim: 10 exec/s: 0 rss: 70Mb L: 3/3 MS: 1 ShuffleBytes- 00:06:52.969 [2024-04-19 10:25:15.004315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004177 cdw11:00000000 00:06:52.969 [2024-04-19 10:25:15.004340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.969 #8 NEW cov: 11825 ft: 12890 corp: 7/16b lim: 10 exec/s: 0 rss: 70Mb L: 3/3 MS: 1 CopyPart- 00:06:52.969 [2024-04-19 10:25:15.044431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004141 cdw11:00000000 00:06:52.969 [2024-04-19 10:25:15.044455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.969 #10 NEW cov: 11825 ft: 12925 corp: 8/18b lim: 10 exec/s: 0 rss: 70Mb L: 2/3 MS: 2 EraseBytes-CopyPart- 00:06:53.230 [2024-04-19 10:25:15.084552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004141 cdw11:00000000 00:06:53.230 [2024-04-19 10:25:15.084576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.230 #11 NEW cov: 11825 ft: 13088 corp: 9/20b lim: 10 exec/s: 0 rss: 70Mb L: 2/3 MS: 1 CopyPart- 00:06:53.230 [2024-04-19 10:25:15.124667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004177 cdw11:00000000 00:06:53.230 [2024-04-19 10:25:15.124692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.230 #12 NEW cov: 11825 ft: 13129 corp: 10/23b lim: 10 exec/s: 0 rss: 70Mb L: 3/3 MS: 1 ChangeBit- 00:06:53.230 [2024-04-19 10:25:15.164758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007751 cdw11:00000000 00:06:53.230 [2024-04-19 10:25:15.164783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.230 #13 NEW cov: 11825 ft: 13233 corp: 11/25b lim: 10 exec/s: 0 rss: 70Mb L: 2/3 MS: 1 EraseBytes- 00:06:53.230 [2024-04-19 10:25:15.204908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004041 cdw11:00000000 00:06:53.230 [2024-04-19 10:25:15.204932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.230 #14 NEW cov: 11825 ft: 13279 corp: 12/27b lim: 10 exec/s: 0 rss: 70Mb L: 2/3 MS: 1 ChangeByte- 00:06:53.230 [2024-04-19 10:25:15.245016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00005541 cdw11:00000000 00:06:53.230 [2024-04-19 10:25:15.245040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:06:53.230 #17 NEW cov: 11825 ft: 13334 corp: 13/29b lim: 10 exec/s: 0 rss: 70Mb L: 2/3 MS: 3 EraseBytes-ShuffleBytes-InsertByte- 00:06:53.230 [2024-04-19 10:25:15.285118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003241 cdw11:00000000 00:06:53.230 [2024-04-19 10:25:15.285145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.230 #18 NEW cov: 11825 ft: 13374 corp: 14/32b lim: 10 exec/s: 0 rss: 70Mb L: 3/3 MS: 1 InsertByte- 00:06:53.230 [2024-04-19 10:25:15.325246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002441 cdw11:00000000 00:06:53.230 [2024-04-19 10:25:15.325271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.491 #19 NEW cov: 11825 ft: 13385 corp: 15/35b lim: 10 exec/s: 0 rss: 70Mb L: 3/3 MS: 1 InsertByte- 00:06:53.491 [2024-04-19 10:25:15.365370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007751 cdw11:00000000 00:06:53.491 [2024-04-19 10:25:15.365394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.491 #21 NEW cov: 11825 ft: 13432 corp: 16/38b lim: 10 exec/s: 0 rss: 70Mb L: 3/3 MS: 2 EraseBytes-CrossOver- 00:06:53.491 [2024-04-19 10:25:15.405624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007eaf cdw11:00000000 00:06:53.491 [2024-04-19 10:25:15.405648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.491 [2024-04-19 10:25:15.405699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000afaf cdw11:00000000 00:06:53.491 [2024-04-19 10:25:15.405713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.491 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:53.491 #25 NEW cov: 11848 ft: 13659 corp: 17/42b lim: 10 exec/s: 0 rss: 71Mb L: 4/4 MS: 4 EraseBytes-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:06:53.491 [2024-04-19 10:25:15.445608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00005141 cdw11:00000000 00:06:53.491 [2024-04-19 10:25:15.445632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.491 #26 NEW cov: 11848 ft: 13724 corp: 18/45b lim: 10 exec/s: 0 rss: 71Mb L: 3/4 MS: 1 ShuffleBytes- 00:06:53.491 [2024-04-19 10:25:15.486085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000077dc cdw11:00000000 00:06:53.491 [2024-04-19 10:25:15.486110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.491 [2024-04-19 10:25:15.486161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000dcdc cdw11:00000000 00:06:53.491 [2024-04-19 10:25:15.486175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.491 [2024-04-19 
10:25:15.486226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000dcdc cdw11:00000000 00:06:53.491 [2024-04-19 10:25:15.486239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:53.491 [2024-04-19 10:25:15.486286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00005140 cdw11:00000000 00:06:53.491 [2024-04-19 10:25:15.486299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:53.491 #27 NEW cov: 11848 ft: 14007 corp: 19/53b lim: 10 exec/s: 27 rss: 71Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:06:53.491 [2024-04-19 10:25:15.525837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007751 cdw11:00000000 00:06:53.491 [2024-04-19 10:25:15.525861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.491 #28 NEW cov: 11848 ft: 14033 corp: 20/55b lim: 10 exec/s: 28 rss: 71Mb L: 2/8 MS: 1 ShuffleBytes- 00:06:53.491 [2024-04-19 10:25:15.565938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000403d cdw11:00000000 00:06:53.491 [2024-04-19 10:25:15.565961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.491 #29 NEW cov: 11848 ft: 14063 corp: 21/57b lim: 10 exec/s: 29 rss: 71Mb L: 2/8 MS: 1 ChangeByte- 00:06:53.752 [2024-04-19 10:25:15.606071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009e5b cdw11:00000000 00:06:53.752 [2024-04-19 10:25:15.606096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.752 #33 NEW cov: 11848 ft: 14152 corp: 22/59b lim: 10 exec/s: 33 rss: 71Mb L: 2/8 MS: 4 EraseBytes-CrossOver-ChangeByte-InsertByte- 00:06:53.752 [2024-04-19 10:25:15.646191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009f41 cdw11:00000000 00:06:53.752 [2024-04-19 10:25:15.646215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.752 #34 NEW cov: 11848 ft: 14201 corp: 23/61b lim: 10 exec/s: 34 rss: 72Mb L: 2/8 MS: 1 ChangeByte- 00:06:53.752 [2024-04-19 10:25:15.686284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004941 cdw11:00000000 00:06:53.752 [2024-04-19 10:25:15.686309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.752 #35 NEW cov: 11848 ft: 14221 corp: 24/63b lim: 10 exec/s: 35 rss: 72Mb L: 2/8 MS: 1 ChangeBinInt- 00:06:53.752 [2024-04-19 10:25:15.726529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000041b0 cdw11:00000000 00:06:53.752 [2024-04-19 10:25:15.726553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.752 [2024-04-19 10:25:15.726603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007741 cdw11:00000000 00:06:53.752 [2024-04-19 10:25:15.726617] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.752 #36 NEW cov: 11848 ft: 14255 corp: 25/67b lim: 10 exec/s: 36 rss: 72Mb L: 4/8 MS: 1 InsertByte- 00:06:53.752 [2024-04-19 10:25:15.766505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004177 cdw11:00000000 00:06:53.752 [2024-04-19 10:25:15.766529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.752 #37 NEW cov: 11848 ft: 14278 corp: 26/69b lim: 10 exec/s: 37 rss: 72Mb L: 2/8 MS: 1 EraseBytes- 00:06:53.752 [2024-04-19 10:25:15.806665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007651 cdw11:00000000 00:06:53.752 [2024-04-19 10:25:15.806689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.752 #43 NEW cov: 11848 ft: 14293 corp: 27/71b lim: 10 exec/s: 43 rss: 72Mb L: 2/8 MS: 1 ChangeBit- 00:06:53.752 [2024-04-19 10:25:15.846757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004177 cdw11:00000000 00:06:53.752 [2024-04-19 10:25:15.846780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.013 #44 NEW cov: 11848 ft: 14334 corp: 28/74b lim: 10 exec/s: 44 rss: 72Mb L: 3/8 MS: 1 ChangeByte- 00:06:54.013 [2024-04-19 10:25:15.886836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000077d1 cdw11:00000000 00:06:54.013 [2024-04-19 10:25:15.886860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.013 #45 NEW cov: 11848 ft: 14346 corp: 29/76b lim: 10 exec/s: 45 rss: 72Mb L: 2/8 MS: 1 ChangeBit- 00:06:54.013 [2024-04-19 10:25:15.926938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00005141 cdw11:00000000 00:06:54.013 [2024-04-19 10:25:15.926962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.013 #46 NEW cov: 11848 ft: 14368 corp: 30/78b lim: 10 exec/s: 46 rss: 72Mb L: 2/8 MS: 1 EraseBytes- 00:06:54.013 [2024-04-19 10:25:15.967436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000077dc cdw11:00000000 00:06:54.013 [2024-04-19 10:25:15.967460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.013 [2024-04-19 10:25:15.967512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000dcdc cdw11:00000000 00:06:54.013 [2024-04-19 10:25:15.967525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.013 [2024-04-19 10:25:15.967577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000dcdc cdw11:00000000 00:06:54.013 [2024-04-19 10:25:15.967590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.013 [2024-04-19 10:25:15.967640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ 
(04) qid:0 cid:7 nsid:0 cdw10:00005140 cdw11:00000000 00:06:54.013 [2024-04-19 10:25:15.967653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:54.013 #47 NEW cov: 11848 ft: 14383 corp: 31/86b lim: 10 exec/s: 47 rss: 72Mb L: 8/8 MS: 1 ShuffleBytes- 00:06:54.013 [2024-04-19 10:25:16.007168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004141 cdw11:00000000 00:06:54.013 [2024-04-19 10:25:16.007192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.013 #48 NEW cov: 11848 ft: 14394 corp: 32/88b lim: 10 exec/s: 48 rss: 72Mb L: 2/8 MS: 1 CopyPart- 00:06:54.013 [2024-04-19 10:25:16.047309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007702 cdw11:00000000 00:06:54.013 [2024-04-19 10:25:16.047333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.013 #49 NEW cov: 11848 ft: 14424 corp: 33/90b lim: 10 exec/s: 49 rss: 72Mb L: 2/8 MS: 1 ChangeBinInt- 00:06:54.013 [2024-04-19 10:25:16.087410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004177 cdw11:00000000 00:06:54.013 [2024-04-19 10:25:16.087434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.013 #50 NEW cov: 11848 ft: 14432 corp: 34/93b lim: 10 exec/s: 50 rss: 72Mb L: 3/8 MS: 1 ChangeByte- 00:06:54.013 [2024-04-19 10:25:16.117484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000412d cdw11:00000000 00:06:54.013 [2024-04-19 10:25:16.117508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.274 #51 NEW cov: 11848 ft: 14455 corp: 35/96b lim: 10 exec/s: 51 rss: 72Mb L: 3/8 MS: 1 InsertByte- 00:06:54.274 [2024-04-19 10:25:16.157596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c149 cdw11:00000000 00:06:54.274 [2024-04-19 10:25:16.157621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.274 #54 NEW cov: 11848 ft: 14470 corp: 36/99b lim: 10 exec/s: 54 rss: 72Mb L: 3/8 MS: 3 EraseBytes-ChangeBit-CrossOver- 00:06:54.274 [2024-04-19 10:25:16.198079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000513b cdw11:00000000 00:06:54.274 [2024-04-19 10:25:16.198104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.274 [2024-04-19 10:25:16.198158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00003b3b cdw11:00000000 00:06:54.274 [2024-04-19 10:25:16.198171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.274 [2024-04-19 10:25:16.198238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00003b3b cdw11:00000000 00:06:54.274 [2024-04-19 10:25:16.198252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:06:54.274 [2024-04-19 10:25:16.198305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00003b41 cdw11:00000000 00:06:54.274 [2024-04-19 10:25:16.198319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:54.274 #55 NEW cov: 11848 ft: 14478 corp: 37/108b lim: 10 exec/s: 55 rss: 72Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:06:54.274 [2024-04-19 10:25:16.238225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000041ff cdw11:00000000 00:06:54.274 [2024-04-19 10:25:16.238250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.274 [2024-04-19 10:25:16.238305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:06:54.274 [2024-04-19 10:25:16.238318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.274 [2024-04-19 10:25:16.238370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:06:54.274 [2024-04-19 10:25:16.238384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.274 [2024-04-19 10:25:16.238436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000b077 cdw11:00000000 00:06:54.274 [2024-04-19 10:25:16.238449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:54.274 #56 NEW cov: 11848 ft: 14495 corp: 38/117b lim: 10 exec/s: 56 rss: 72Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:06:54.274 [2024-04-19 10:25:16.277957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000490a cdw11:00000000 00:06:54.274 [2024-04-19 10:25:16.277981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.274 #57 NEW cov: 11848 ft: 14546 corp: 39/119b lim: 10 exec/s: 57 rss: 72Mb L: 2/9 MS: 1 ChangeBit- 00:06:54.274 [2024-04-19 10:25:16.318313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000077dc cdw11:00000000 00:06:54.274 [2024-04-19 10:25:16.318338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.274 [2024-04-19 10:25:16.318389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000dcdc cdw11:00000000 00:06:54.274 [2024-04-19 10:25:16.318402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.274 [2024-04-19 10:25:16.318453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00005140 cdw11:00000000 00:06:54.274 [2024-04-19 10:25:16.318466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.274 #58 NEW cov: 11848 ft: 14697 corp: 40/125b lim: 10 exec/s: 58 rss: 72Mb L: 6/9 MS: 1 EraseBytes- 00:06:54.274 [2024-04-19 10:25:16.358189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000419f cdw11:00000000 00:06:54.274 [2024-04-19 10:25:16.358213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.536 #59 NEW cov: 11848 ft: 14707 corp: 41/127b lim: 10 exec/s: 59 rss: 72Mb L: 2/9 MS: 1 ShuffleBytes- 00:06:54.536 [2024-04-19 10:25:16.398526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007600 cdw11:00000000 00:06:54.536 [2024-04-19 10:25:16.398551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.536 [2024-04-19 10:25:16.398617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.536 [2024-04-19 10:25:16.398630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.536 [2024-04-19 10:25:16.398679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000051 cdw11:00000000 00:06:54.536 [2024-04-19 10:25:16.398692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.536 #60 NEW cov: 11848 ft: 14758 corp: 42/133b lim: 10 exec/s: 60 rss: 72Mb L: 6/9 MS: 1 InsertRepeatedBytes- 00:06:54.536 [2024-04-19 10:25:16.438426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ff0c cdw11:00000000 00:06:54.536 [2024-04-19 10:25:16.438450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.536 #61 NEW cov: 11848 ft: 14762 corp: 43/135b lim: 10 exec/s: 61 rss: 72Mb L: 2/9 MS: 1 CMP- DE: "\377\014"- 00:06:54.536 [2024-04-19 10:25:16.468534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000600 cdw11:00000000 00:06:54.536 [2024-04-19 10:25:16.468558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.536 #62 NEW cov: 11848 ft: 14773 corp: 44/137b lim: 10 exec/s: 62 rss: 72Mb L: 2/9 MS: 1 CMP- DE: "\006\000"- 00:06:54.536 [2024-04-19 10:25:16.508649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a8a cdw11:00000000 00:06:54.536 [2024-04-19 10:25:16.508674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.536 #64 pulse cov: 11848 ft: 14781 corp: 44/137b lim: 10 exec/s: 32 rss: 72Mb 00:06:54.536 #64 NEW cov: 11848 ft: 14781 corp: 45/139b lim: 10 exec/s: 32 rss: 72Mb L: 2/9 MS: 2 ChangeBit-CopyPart- 00:06:54.536 #64 DONE cov: 11848 ft: 14781 corp: 45/139b lim: 10 exec/s: 32 rss: 72Mb 00:06:54.536 ###### Recommended dictionary. ###### 00:06:54.536 "\377\014" # Uses: 0 00:06:54.536 "\006\000" # Uses: 0 00:06:54.536 ###### End of recommended dictionary. 
###### 00:06:54.536 Done 64 runs in 2 second(s) 00:06:54.797 10:25:16 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_6.conf /var/tmp/suppress_nvmf_fuzz 00:06:54.797 10:25:16 -- ../common.sh@72 -- # (( i++ )) 00:06:54.797 10:25:16 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:54.797 10:25:16 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:06:54.797 10:25:16 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:06:54.797 10:25:16 -- nvmf/run.sh@24 -- # local timen=1 00:06:54.797 10:25:16 -- nvmf/run.sh@25 -- # local core=0x1 00:06:54.797 10:25:16 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:06:54.797 10:25:16 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:06:54.797 10:25:16 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:06:54.797 10:25:16 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:06:54.797 10:25:16 -- nvmf/run.sh@34 -- # printf %02d 7 00:06:54.797 10:25:16 -- nvmf/run.sh@34 -- # port=4407 00:06:54.797 10:25:16 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:06:54.797 10:25:16 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:06:54.797 10:25:16 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:54.797 10:25:16 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:06:54.797 10:25:16 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:06:54.797 10:25:16 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 00:06:54.797 [2024-04-19 10:25:16.688479] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:06:54.797 [2024-04-19 10:25:16.688551] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid202701 ] 00:06:54.797 EAL: No free 2048 kB hugepages reported on node 1 00:06:55.057 [2024-04-19 10:25:16.942297] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.057 [2024-04-19 10:25:17.025301] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.057 [2024-04-19 10:25:17.084198] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:55.057 [2024-04-19 10:25:17.100332] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:06:55.057 INFO: Running with entropic power schedule (0xFF, 100). 
00:06:55.057 INFO: Seed: 2699677373 00:06:55.057 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:06:55.057 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:06:55.057 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:06:55.057 INFO: A corpus is not provided, starting from an empty corpus 00:06:55.057 #2 INITED exec/s: 0 rss: 62Mb 00:06:55.057 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:06:55.057 This may also happen if the target rejected all inputs we tried so far 00:06:55.057 [2024-04-19 10:25:17.145629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a15 cdw11:00000000 00:06:55.057 [2024-04-19 10:25:17.145657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.576 NEW_FUNC[1/669]: 0x48d200 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:06:55.576 NEW_FUNC[2/669]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:55.576 #3 NEW cov: 11604 ft: 11605 corp: 2/4b lim: 10 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 CMP- DE: "\025\001"- 00:06:55.576 [2024-04-19 10:25:17.477884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a6c cdw11:00000000 00:06:55.576 [2024-04-19 10:25:17.477931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.576 #6 NEW cov: 11734 ft: 12137 corp: 3/6b lim: 10 exec/s: 0 rss: 69Mb L: 2/3 MS: 3 ShuffleBytes-ChangeBit-InsertByte- 00:06:55.576 [2024-04-19 10:25:17.528093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a6c cdw11:00000000 00:06:55.576 [2024-04-19 10:25:17.528122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.576 #7 NEW cov: 11740 ft: 12497 corp: 4/8b lim: 10 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 ShuffleBytes- 00:06:55.576 [2024-04-19 10:25:17.588423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000dfdf cdw11:00000000 00:06:55.576 [2024-04-19 10:25:17.588452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.576 #11 NEW cov: 11825 ft: 12770 corp: 5/10b lim: 10 exec/s: 0 rss: 69Mb L: 2/3 MS: 4 ChangeByte-ChangeBit-ShuffleBytes-CopyPart- 00:06:55.576 [2024-04-19 10:25:17.638843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000bd93 cdw11:00000000 00:06:55.576 [2024-04-19 10:25:17.638871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.576 #12 NEW cov: 11825 ft: 12841 corp: 6/12b lim: 10 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 ChangeBinInt- 00:06:55.837 [2024-04-19 10:25:17.699217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000fe93 cdw11:00000000 00:06:55.837 [2024-04-19 10:25:17.699244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:06:55.837 #13 NEW cov: 11825 ft: 12926 corp: 7/14b lim: 10 exec/s: 0 rss: 70Mb L: 2/3 MS: 1 ChangeByte- 00:06:55.837 [2024-04-19 10:25:17.759299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000fe93 cdw11:00000000 00:06:55.837 [2024-04-19 10:25:17.759330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.837 #14 NEW cov: 11825 ft: 12976 corp: 8/17b lim: 10 exec/s: 0 rss: 70Mb L: 3/3 MS: 1 CopyPart- 00:06:55.837 [2024-04-19 10:25:17.819463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001501 cdw11:00000000 00:06:55.837 [2024-04-19 10:25:17.819490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.837 #15 NEW cov: 11825 ft: 13051 corp: 9/19b lim: 10 exec/s: 0 rss: 70Mb L: 2/3 MS: 1 PersAutoDict- DE: "\025\001"- 00:06:55.837 [2024-04-19 10:25:17.869892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a15 cdw11:00000000 00:06:55.837 [2024-04-19 10:25:17.869918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.837 [2024-04-19 10:25:17.870000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000016c cdw11:00000000 00:06:55.837 [2024-04-19 10:25:17.870016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.837 #16 NEW cov: 11825 ft: 13278 corp: 10/23b lim: 10 exec/s: 0 rss: 70Mb L: 4/4 MS: 1 PersAutoDict- DE: "\025\001"- 00:06:55.837 [2024-04-19 10:25:17.920165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001515 cdw11:00000000 00:06:55.837 [2024-04-19 10:25:17.920191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.837 [2024-04-19 10:25:17.920281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000101 cdw11:00000000 00:06:55.837 [2024-04-19 10:25:17.920298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.837 #17 NEW cov: 11825 ft: 13327 corp: 11/27b lim: 10 exec/s: 0 rss: 70Mb L: 4/4 MS: 1 PersAutoDict- DE: "\025\001"- 00:06:56.133 [2024-04-19 10:25:17.980320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001515 cdw11:00000000 00:06:56.133 [2024-04-19 10:25:17.980346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.133 [2024-04-19 10:25:17.980431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008101 cdw11:00000000 00:06:56.133 [2024-04-19 10:25:17.980448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.133 #18 NEW cov: 11825 ft: 13375 corp: 12/31b lim: 10 exec/s: 0 rss: 70Mb L: 4/4 MS: 1 ChangeBit- 00:06:56.133 [2024-04-19 10:25:18.040343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006c6c cdw11:00000000 
00:06:56.133 [2024-04-19 10:25:18.040370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.133 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:56.133 #21 NEW cov: 11848 ft: 13421 corp: 13/33b lim: 10 exec/s: 0 rss: 70Mb L: 2/4 MS: 3 EraseBytes-ShuffleBytes-CopyPart- 00:06:56.133 [2024-04-19 10:25:18.091041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000bd06 cdw11:00000000 00:06:56.133 [2024-04-19 10:25:18.091066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.133 [2024-04-19 10:25:18.091154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:56.133 [2024-04-19 10:25:18.091169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.133 [2024-04-19 10:25:18.091259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000093 cdw11:00000000 00:06:56.133 [2024-04-19 10:25:18.091275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.133 #22 NEW cov: 11848 ft: 13661 corp: 14/39b lim: 10 exec/s: 0 rss: 70Mb L: 6/6 MS: 1 CMP- DE: "\006\000\000\000"- 00:06:56.133 [2024-04-19 10:25:18.140813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001501 cdw11:00000000 00:06:56.133 [2024-04-19 10:25:18.140838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.133 #23 NEW cov: 11848 ft: 13764 corp: 15/41b lim: 10 exec/s: 23 rss: 70Mb L: 2/6 MS: 1 EraseBytes- 00:06:56.134 [2024-04-19 10:25:18.201195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff6c cdw11:00000000 00:06:56.134 [2024-04-19 10:25:18.201221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.134 #26 NEW cov: 11848 ft: 13834 corp: 16/43b lim: 10 exec/s: 26 rss: 70Mb L: 2/6 MS: 3 EraseBytes-CopyPart-InsertByte- 00:06:56.438 [2024-04-19 10:25:18.251473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003281 cdw11:00000000 00:06:56.438 [2024-04-19 10:25:18.251500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.438 #28 NEW cov: 11848 ft: 13840 corp: 17/45b lim: 10 exec/s: 28 rss: 70Mb L: 2/6 MS: 2 CrossOver-InsertByte- 00:06:56.438 [2024-04-19 10:25:18.301793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001515 cdw11:00000000 00:06:56.438 [2024-04-19 10:25:18.301823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.438 [2024-04-19 10:25:18.301911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000cb01 cdw11:00000000 00:06:56.438 [2024-04-19 10:25:18.301927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 
p:0 m:0 dnr:0 00:06:56.438 #29 NEW cov: 11848 ft: 13871 corp: 18/49b lim: 10 exec/s: 29 rss: 71Mb L: 4/6 MS: 1 ChangeByte- 00:06:56.438 [2024-04-19 10:25:18.361794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006001 cdw11:00000000 00:06:56.438 [2024-04-19 10:25:18.361826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.438 #30 NEW cov: 11848 ft: 13893 corp: 19/51b lim: 10 exec/s: 30 rss: 71Mb L: 2/6 MS: 1 ChangeByte- 00:06:56.438 [2024-04-19 10:25:18.412157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000fe93 cdw11:00000000 00:06:56.438 [2024-04-19 10:25:18.412183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.438 #31 NEW cov: 11848 ft: 13910 corp: 20/53b lim: 10 exec/s: 31 rss: 71Mb L: 2/6 MS: 1 EraseBytes- 00:06:56.438 [2024-04-19 10:25:18.463091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000bd06 cdw11:00000000 00:06:56.438 [2024-04-19 10:25:18.463116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.438 [2024-04-19 10:25:18.463199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000dfdf cdw11:00000000 00:06:56.438 [2024-04-19 10:25:18.463216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.438 [2024-04-19 10:25:18.463303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000093 cdw11:00000000 00:06:56.438 [2024-04-19 10:25:18.463319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.438 #32 NEW cov: 11848 ft: 13921 corp: 21/59b lim: 10 exec/s: 32 rss: 71Mb L: 6/6 MS: 1 CrossOver- 00:06:56.438 [2024-04-19 10:25:18.522831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000fe93 cdw11:00000000 00:06:56.438 [2024-04-19 10:25:18.522859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.438 #33 NEW cov: 11848 ft: 13972 corp: 22/62b lim: 10 exec/s: 33 rss: 71Mb L: 3/6 MS: 1 ChangeBinInt- 00:06:56.715 [2024-04-19 10:25:18.573124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000026c cdw11:00000000 00:06:56.715 [2024-04-19 10:25:18.573150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.715 #34 NEW cov: 11848 ft: 13988 corp: 23/64b lim: 10 exec/s: 34 rss: 71Mb L: 2/6 MS: 1 ChangeBinInt- 00:06:56.715 [2024-04-19 10:25:18.624387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000f139 cdw11:00000000 00:06:56.715 [2024-04-19 10:25:18.624412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.715 [2024-04-19 10:25:18.624503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e75e cdw11:00000000 00:06:56.715 [2024-04-19 10:25:18.624519] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.715 [2024-04-19 10:25:18.624605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000f206 cdw11:00000000 00:06:56.715 [2024-04-19 10:25:18.624624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.715 [2024-04-19 10:25:18.624712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000e6ff cdw11:00000000 00:06:56.715 [2024-04-19 10:25:18.624728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:56.715 [2024-04-19 10:25:18.624814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00004a6c cdw11:00000000 00:06:56.715 [2024-04-19 10:25:18.624831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:56.715 #35 NEW cov: 11848 ft: 14236 corp: 24/74b lim: 10 exec/s: 35 rss: 71Mb L: 10/10 MS: 1 CMP- DE: "\3619\347^\362\006\346\377"- 00:06:56.715 [2024-04-19 10:25:18.674071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001515 cdw11:00000000 00:06:56.715 [2024-04-19 10:25:18.674097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.715 [2024-04-19 10:25:18.674187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001581 cdw11:00000000 00:06:56.715 [2024-04-19 10:25:18.674202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.715 [2024-04-19 10:25:18.674289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000101 cdw11:00000000 00:06:56.715 [2024-04-19 10:25:18.674309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.715 #36 NEW cov: 11848 ft: 14246 corp: 25/80b lim: 10 exec/s: 36 rss: 71Mb L: 6/10 MS: 1 CrossOver- 00:06:56.715 [2024-04-19 10:25:18.723718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000101 cdw11:00000000 00:06:56.715 [2024-04-19 10:25:18.723743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.715 #37 NEW cov: 11848 ft: 14252 corp: 26/82b lim: 10 exec/s: 37 rss: 72Mb L: 2/10 MS: 1 CopyPart- 00:06:56.715 [2024-04-19 10:25:18.774800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:56.715 [2024-04-19 10:25:18.774832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.715 [2024-04-19 10:25:18.774921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:56.715 [2024-04-19 10:25:18.774939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.715 [2024-04-19 10:25:18.775027] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00001515 cdw11:00000000 00:06:56.715 [2024-04-19 10:25:18.775045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.715 [2024-04-19 10:25:18.775131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000cb01 cdw11:00000000 00:06:56.715 [2024-04-19 10:25:18.775151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:56.715 #38 NEW cov: 11848 ft: 14292 corp: 27/90b lim: 10 exec/s: 38 rss: 72Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:06:56.996 [2024-04-19 10:25:18.834344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000019d cdw11:00000000 00:06:56.996 [2024-04-19 10:25:18.834372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.996 #39 NEW cov: 11848 ft: 14323 corp: 28/92b lim: 10 exec/s: 39 rss: 72Mb L: 2/10 MS: 1 ChangeByte- 00:06:56.997 [2024-04-19 10:25:18.895510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001515 cdw11:00000000 00:06:56.997 [2024-04-19 10:25:18.895541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.997 [2024-04-19 10:25:18.895646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001515 cdw11:00000000 00:06:56.997 [2024-04-19 10:25:18.895663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.997 [2024-04-19 10:25:18.895757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00008115 cdw11:00000000 00:06:56.997 [2024-04-19 10:25:18.895776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.997 [2024-04-19 10:25:18.895871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00001581 cdw11:00000000 00:06:56.997 [2024-04-19 10:25:18.895889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:56.997 [2024-04-19 10:25:18.895971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000101 cdw11:00000000 00:06:56.997 [2024-04-19 10:25:18.895989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:56.997 #40 NEW cov: 11848 ft: 14366 corp: 29/102b lim: 10 exec/s: 40 rss: 72Mb L: 10/10 MS: 1 CopyPart- 00:06:56.997 [2024-04-19 10:25:18.954737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000015d4 cdw11:00000000 00:06:56.997 [2024-04-19 10:25:18.954769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.997 #41 NEW cov: 11848 ft: 14387 corp: 30/105b lim: 10 exec/s: 41 rss: 72Mb L: 3/10 MS: 1 InsertByte- 00:06:56.997 [2024-04-19 10:25:19.005214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001515 
cdw11:00000000 00:06:56.997 [2024-04-19 10:25:19.005244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.997 [2024-04-19 10:25:19.005335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001515 cdw11:00000000 00:06:56.997 [2024-04-19 10:25:19.005352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.997 #42 NEW cov: 11848 ft: 14389 corp: 31/110b lim: 10 exec/s: 42 rss: 72Mb L: 5/10 MS: 1 EraseBytes- 00:06:56.997 [2024-04-19 10:25:19.065544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001515 cdw11:00000000 00:06:56.997 [2024-04-19 10:25:19.065572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.997 [2024-04-19 10:25:19.065667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001515 cdw11:00000000 00:06:56.997 [2024-04-19 10:25:19.065685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.997 #43 NEW cov: 11848 ft: 14399 corp: 32/115b lim: 10 exec/s: 43 rss: 72Mb L: 5/10 MS: 1 CrossOver- 00:06:57.273 [2024-04-19 10:25:19.115734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000fe93 cdw11:00000000 00:06:57.273 [2024-04-19 10:25:19.115762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.273 [2024-04-19 10:25:19.115873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff6c cdw11:00000000 00:06:57.273 [2024-04-19 10:25:19.115890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.273 #44 NEW cov: 11848 ft: 14445 corp: 33/120b lim: 10 exec/s: 22 rss: 72Mb L: 5/10 MS: 1 CrossOver- 00:06:57.273 #44 DONE cov: 11848 ft: 14445 corp: 33/120b lim: 10 exec/s: 22 rss: 72Mb 00:06:57.273 ###### Recommended dictionary. ###### 00:06:57.273 "\025\001" # Uses: 3 00:06:57.273 "\006\000\000\000" # Uses: 0 00:06:57.273 "\3619\347^\362\006\346\377" # Uses: 0 00:06:57.273 ###### End of recommended dictionary. 
###### 00:06:57.273 Done 44 runs in 2 second(s) 00:06:57.273 10:25:19 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_7.conf /var/tmp/suppress_nvmf_fuzz 00:06:57.273 10:25:19 -- ../common.sh@72 -- # (( i++ )) 00:06:57.273 10:25:19 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:57.273 10:25:19 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:06:57.273 10:25:19 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:06:57.273 10:25:19 -- nvmf/run.sh@24 -- # local timen=1 00:06:57.273 10:25:19 -- nvmf/run.sh@25 -- # local core=0x1 00:06:57.273 10:25:19 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:06:57.273 10:25:19 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:06:57.273 10:25:19 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:06:57.273 10:25:19 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:06:57.273 10:25:19 -- nvmf/run.sh@34 -- # printf %02d 8 00:06:57.273 10:25:19 -- nvmf/run.sh@34 -- # port=4408 00:06:57.273 10:25:19 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:06:57.273 10:25:19 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:06:57.274 10:25:19 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:57.274 10:25:19 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:06:57.274 10:25:19 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:06:57.274 10:25:19 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 00:06:57.274 [2024-04-19 10:25:19.317224] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:06:57.274 [2024-04-19 10:25:19.317290] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid203051 ] 00:06:57.274 EAL: No free 2048 kB hugepages reported on node 1 00:06:57.561 [2024-04-19 10:25:19.569585] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.561 [2024-04-19 10:25:19.648961] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.832 [2024-04-19 10:25:19.708379] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:57.832 [2024-04-19 10:25:19.724515] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:06:57.832 INFO: Running with entropic power schedule (0xFF, 100). 
00:06:57.832 INFO: Seed: 1029716337 00:06:57.832 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:06:57.832 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:06:57.832 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:06:57.832 INFO: A corpus is not provided, starting from an empty corpus 00:06:57.832 [2024-04-19 10:25:19.769333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.832 [2024-04-19 10:25:19.769366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.832 #2 INITED cov: 11632 ft: 11633 corp: 1/1b exec/s: 0 rss: 68Mb 00:06:57.832 [2024-04-19 10:25:19.819289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.832 [2024-04-19 10:25:19.819319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.832 #3 NEW cov: 11762 ft: 12159 corp: 2/2b lim: 5 exec/s: 0 rss: 68Mb L: 1/1 MS: 1 ShuffleBytes- 00:06:57.832 [2024-04-19 10:25:19.889462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.832 [2024-04-19 10:25:19.889492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.182 #4 NEW cov: 11768 ft: 12357 corp: 3/3b lim: 5 exec/s: 0 rss: 68Mb L: 1/1 MS: 1 ChangeByte- 00:06:58.182 [2024-04-19 10:25:19.959803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.182 [2024-04-19 10:25:19.959844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.182 [2024-04-19 10:25:19.959882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.182 [2024-04-19 10:25:19.959901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.182 #5 NEW cov: 11853 ft: 13322 corp: 4/5b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CrossOver- 00:06:58.182 [2024-04-19 10:25:20.030031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.182 [2024-04-19 10:25:20.030065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.182 [2024-04-19 10:25:20.030120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.182 [2024-04-19 10:25:20.030137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.182 [2024-04-19 10:25:20.030168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.182 [2024-04-19 10:25:20.030184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.182 #6 NEW cov: 11853 ft: 13561 corp: 5/8b lim: 5 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 CrossOver- 00:06:58.182 [2024-04-19 10:25:20.100194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.182 [2024-04-19 10:25:20.100233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.182 [2024-04-19 10:25:20.100268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.182 [2024-04-19 10:25:20.100284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.182 [2024-04-19 10:25:20.100315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.182 [2024-04-19 10:25:20.100330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.182 #7 NEW cov: 11853 ft: 13739 corp: 6/11b lim: 5 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 InsertByte- 00:06:58.182 [2024-04-19 10:25:20.160268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.182 [2024-04-19 10:25:20.160303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.182 [2024-04-19 10:25:20.160337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.182 [2024-04-19 10:25:20.160369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.182 #8 NEW cov: 11853 ft: 13802 corp: 7/13b lim: 5 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 InsertByte- 00:06:58.182 [2024-04-19 10:25:20.220424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.182 [2024-04-19 10:25:20.220457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.182 [2024-04-19 10:25:20.220492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.182 [2024-04-19 10:25:20.220508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.182 #9 NEW cov: 11853 ft: 13832 corp: 8/15b lim: 5 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 ChangeBinInt- 00:06:58.462 [2024-04-19 10:25:20.290617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:06:58.462 [2024-04-19 10:25:20.290650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.462 [2024-04-19 10:25:20.290685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.462 [2024-04-19 10:25:20.290705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.462 #10 NEW cov: 11853 ft: 13875 corp: 9/17b lim: 5 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 InsertByte- 00:06:58.462 [2024-04-19 10:25:20.340596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.462 [2024-04-19 10:25:20.340626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.462 #11 NEW cov: 11853 ft: 13923 corp: 10/18b lim: 5 exec/s: 0 rss: 69Mb L: 1/3 MS: 1 ShuffleBytes- 00:06:58.462 [2024-04-19 10:25:20.390854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.462 [2024-04-19 10:25:20.390883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.462 [2024-04-19 10:25:20.390932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.462 [2024-04-19 10:25:20.390948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.462 [2024-04-19 10:25:20.390978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.462 [2024-04-19 10:25:20.390993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.462 #12 NEW cov: 11853 ft: 14020 corp: 11/21b lim: 5 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 InsertByte- 00:06:58.462 [2024-04-19 10:25:20.440918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.462 [2024-04-19 10:25:20.440948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.462 [2024-04-19 10:25:20.440997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.462 [2024-04-19 10:25:20.441013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.462 #13 NEW cov: 11853 ft: 14039 corp: 12/23b lim: 5 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 CopyPart- 00:06:58.462 [2024-04-19 10:25:20.491144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.462 [2024-04-19 10:25:20.491173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.462 [2024-04-19 10:25:20.491222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.462 [2024-04-19 10:25:20.491238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.462 [2024-04-19 10:25:20.491267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.462 [2024-04-19 10:25:20.491283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.462 #14 NEW cov: 11853 ft: 14060 corp: 13/26b lim: 5 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 InsertByte- 00:06:58.462 [2024-04-19 10:25:20.561318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.462 [2024-04-19 10:25:20.561353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.462 [2024-04-19 10:25:20.561403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.462 [2024-04-19 10:25:20.561420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.462 [2024-04-19 10:25:20.561451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.462 [2024-04-19 10:25:20.561466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.770 #15 NEW cov: 11853 ft: 14068 corp: 14/29b lim: 5 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 InsertByte- 00:06:58.770 [2024-04-19 10:25:20.631454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.770 [2024-04-19 10:25:20.631484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.770 [2024-04-19 10:25:20.631533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.770 [2024-04-19 10:25:20.631549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.057 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:59.057 #16 NEW cov: 11876 ft: 14103 corp: 15/31b lim: 5 exec/s: 16 rss: 70Mb L: 2/3 MS: 1 EraseBytes- 00:06:59.057 [2024-04-19 10:25:20.972628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.057 [2024-04-19 10:25:20.972675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.057 
[2024-04-19 10:25:20.972726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.057 [2024-04-19 10:25:20.972741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.057 [2024-04-19 10:25:20.972772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.057 [2024-04-19 10:25:20.972787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.057 [2024-04-19 10:25:20.972826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.057 [2024-04-19 10:25:20.972842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:59.057 [2024-04-19 10:25:20.972872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.057 [2024-04-19 10:25:20.972888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:59.057 #17 NEW cov: 11876 ft: 14447 corp: 16/36b lim: 5 exec/s: 17 rss: 70Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:06:59.057 [2024-04-19 10:25:21.032592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.057 [2024-04-19 10:25:21.032624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.057 [2024-04-19 10:25:21.032677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.057 [2024-04-19 10:25:21.032693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.057 [2024-04-19 10:25:21.032723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.057 [2024-04-19 10:25:21.032739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.057 #18 NEW cov: 11876 ft: 14450 corp: 17/39b lim: 5 exec/s: 18 rss: 70Mb L: 3/5 MS: 1 ShuffleBytes- 00:06:59.057 [2024-04-19 10:25:21.082586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.057 [2024-04-19 10:25:21.082615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.057 [2024-04-19 10:25:21.082663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.057 [2024-04-19 10:25:21.082678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.057 #19 NEW cov: 11876 ft: 14462 corp: 18/41b lim: 5 exec/s: 19 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:06:59.057 [2024-04-19 10:25:21.132651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.057 [2024-04-19 10:25:21.132691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.434 #20 NEW cov: 11876 ft: 14470 corp: 19/42b lim: 5 exec/s: 20 rss: 70Mb L: 1/5 MS: 1 ChangeByte- 00:06:59.434 [2024-04-19 10:25:21.182853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.434 [2024-04-19 10:25:21.182883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.434 [2024-04-19 10:25:21.182932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.434 [2024-04-19 10:25:21.182948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.434 #21 NEW cov: 11876 ft: 14544 corp: 20/44b lim: 5 exec/s: 21 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:06:59.434 [2024-04-19 10:25:21.253003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.434 [2024-04-19 10:25:21.253036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.434 #22 NEW cov: 11876 ft: 14548 corp: 21/45b lim: 5 exec/s: 22 rss: 71Mb L: 1/5 MS: 1 ShuffleBytes- 00:06:59.434 [2024-04-19 10:25:21.323147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.434 [2024-04-19 10:25:21.323176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.434 #23 NEW cov: 11876 ft: 14561 corp: 22/46b lim: 5 exec/s: 23 rss: 71Mb L: 1/5 MS: 1 CopyPart- 00:06:59.434 [2024-04-19 10:25:21.373249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.434 [2024-04-19 10:25:21.373278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.434 #24 NEW cov: 11876 ft: 14562 corp: 23/47b lim: 5 exec/s: 24 rss: 71Mb L: 1/5 MS: 1 ChangeBit- 00:06:59.434 [2024-04-19 10:25:21.423463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.434 [2024-04-19 10:25:21.423492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.434 [2024-04-19 10:25:21.423541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:06:59.434 [2024-04-19 10:25:21.423557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.434 #25 NEW cov: 11876 ft: 14580 corp: 24/49b lim: 5 exec/s: 25 rss: 71Mb L: 2/5 MS: 1 EraseBytes- 00:06:59.434 [2024-04-19 10:25:21.493742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.434 [2024-04-19 10:25:21.493773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.434 [2024-04-19 10:25:21.493807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.434 [2024-04-19 10:25:21.493831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.434 [2024-04-19 10:25:21.493862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.434 [2024-04-19 10:25:21.493878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.434 #26 NEW cov: 11876 ft: 14634 corp: 25/52b lim: 5 exec/s: 26 rss: 71Mb L: 3/5 MS: 1 InsertByte- 00:06:59.745 [2024-04-19 10:25:21.543878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.745 [2024-04-19 10:25:21.543920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.745 [2024-04-19 10:25:21.543957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.745 [2024-04-19 10:25:21.543973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.745 [2024-04-19 10:25:21.544003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.745 [2024-04-19 10:25:21.544020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.745 #27 NEW cov: 11876 ft: 14640 corp: 26/55b lim: 5 exec/s: 27 rss: 71Mb L: 3/5 MS: 1 ShuffleBytes- 00:06:59.745 [2024-04-19 10:25:21.593956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.745 [2024-04-19 10:25:21.593988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.745 [2024-04-19 10:25:21.594037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.745 [2024-04-19 10:25:21.594053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.745 
[2024-04-19 10:25:21.594083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.745 [2024-04-19 10:25:21.594103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.745 #28 NEW cov: 11876 ft: 14647 corp: 27/58b lim: 5 exec/s: 28 rss: 71Mb L: 3/5 MS: 1 CrossOver- 00:06:59.745 [2024-04-19 10:25:21.644065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.745 [2024-04-19 10:25:21.644096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.745 [2024-04-19 10:25:21.644144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.745 [2024-04-19 10:25:21.644160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.745 [2024-04-19 10:25:21.644190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.745 [2024-04-19 10:25:21.644206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.745 #29 NEW cov: 11876 ft: 14648 corp: 28/61b lim: 5 exec/s: 29 rss: 71Mb L: 3/5 MS: 1 ShuffleBytes- 00:06:59.745 [2024-04-19 10:25:21.694136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.745 [2024-04-19 10:25:21.694165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.745 [2024-04-19 10:25:21.694214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.745 [2024-04-19 10:25:21.694230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.745 #30 NEW cov: 11876 ft: 14662 corp: 29/63b lim: 5 exec/s: 30 rss: 71Mb L: 2/5 MS: 1 CrossOver- 00:06:59.745 [2024-04-19 10:25:21.764342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.745 [2024-04-19 10:25:21.764371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.745 [2024-04-19 10:25:21.764419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.745 [2024-04-19 10:25:21.764435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.745 #31 NEW cov: 11876 ft: 14674 corp: 30/65b lim: 5 exec/s: 15 rss: 71Mb L: 2/5 MS: 1 ChangeBit- 00:06:59.745 #31 DONE cov: 11876 ft: 14674 corp: 30/65b lim: 5 exec/s: 15 rss: 71Mb 00:06:59.745 Done 31 runs in 
2 second(s) 00:07:00.004 10:25:21 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_8.conf /var/tmp/suppress_nvmf_fuzz 00:07:00.004 10:25:21 -- ../common.sh@72 -- # (( i++ )) 00:07:00.004 10:25:21 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:00.004 10:25:21 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:07:00.004 10:25:21 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:00.004 10:25:21 -- nvmf/run.sh@24 -- # local timen=1 00:07:00.004 10:25:21 -- nvmf/run.sh@25 -- # local core=0x1 00:07:00.004 10:25:21 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:00.004 10:25:21 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:00.004 10:25:21 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:00.004 10:25:21 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:00.004 10:25:21 -- nvmf/run.sh@34 -- # printf %02d 9 00:07:00.004 10:25:21 -- nvmf/run.sh@34 -- # port=4409 00:07:00.004 10:25:21 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:00.005 10:25:21 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:00.005 10:25:21 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:00.005 10:25:21 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:00.005 10:25:21 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:00.005 10:25:21 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 00:07:00.005 [2024-04-19 10:25:21.974426] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:07:00.005 [2024-04-19 10:25:21.974504] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid203420 ] 00:07:00.005 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.264 [2024-04-19 10:25:22.229580] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.264 [2024-04-19 10:25:22.313708] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.264 [2024-04-19 10:25:22.372682] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:00.523 [2024-04-19 10:25:22.388802] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:07:00.523 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:00.523 INFO: Seed: 3692715771 00:07:00.523 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:07:00.523 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:07:00.523 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:00.523 INFO: A corpus is not provided, starting from an empty corpus 00:07:00.523 [2024-04-19 10:25:22.444151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.523 [2024-04-19 10:25:22.444180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.523 #2 INITED cov: 11632 ft: 11622 corp: 1/1b exec/s: 0 rss: 68Mb 00:07:00.523 [2024-04-19 10:25:22.484176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.523 [2024-04-19 10:25:22.484201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.523 #3 NEW cov: 11762 ft: 12269 corp: 2/2b lim: 5 exec/s: 0 rss: 68Mb L: 1/1 MS: 1 ChangeBit- 00:07:00.523 [2024-04-19 10:25:22.524288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.523 [2024-04-19 10:25:22.524313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.523 #4 NEW cov: 11768 ft: 12486 corp: 3/3b lim: 5 exec/s: 0 rss: 68Mb L: 1/1 MS: 1 ChangeBit- 00:07:00.523 [2024-04-19 10:25:22.564537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.523 [2024-04-19 10:25:22.564562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.523 [2024-04-19 10:25:22.564616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.523 [2024-04-19 10:25:22.564630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.523 #5 NEW cov: 11853 ft: 13406 corp: 4/5b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:07:00.523 [2024-04-19 10:25:22.604637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.523 [2024-04-19 10:25:22.604663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.523 [2024-04-19 10:25:22.604715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.523 [2024-04-19 10:25:22.604731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.523 #6 NEW cov: 11853 ft: 13494 corp: 5/7b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:07:00.781 
[2024-04-19 10:25:22.644747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.781 [2024-04-19 10:25:22.644771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.781 [2024-04-19 10:25:22.644831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.781 [2024-04-19 10:25:22.644845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.781 #7 NEW cov: 11853 ft: 13572 corp: 6/9b lim: 5 exec/s: 0 rss: 69Mb L: 2/2 MS: 1 ChangeBit- 00:07:00.781 [2024-04-19 10:25:22.694739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.781 [2024-04-19 10:25:22.694764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.781 #8 NEW cov: 11853 ft: 13607 corp: 7/10b lim: 5 exec/s: 0 rss: 69Mb L: 1/2 MS: 1 ShuffleBytes- 00:07:00.781 [2024-04-19 10:25:22.734840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.781 [2024-04-19 10:25:22.734864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.781 #9 NEW cov: 11853 ft: 13683 corp: 8/11b lim: 5 exec/s: 0 rss: 69Mb L: 1/2 MS: 1 ChangeByte- 00:07:00.781 [2024-04-19 10:25:22.774976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.781 [2024-04-19 10:25:22.774999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.781 #10 NEW cov: 11853 ft: 13751 corp: 9/12b lim: 5 exec/s: 0 rss: 69Mb L: 1/2 MS: 1 CrossOver- 00:07:00.781 [2024-04-19 10:25:22.815238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.782 [2024-04-19 10:25:22.815264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.782 [2024-04-19 10:25:22.815319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.782 [2024-04-19 10:25:22.815333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.782 #11 NEW cov: 11853 ft: 13780 corp: 10/14b lim: 5 exec/s: 0 rss: 69Mb L: 2/2 MS: 1 ChangeByte- 00:07:00.782 [2024-04-19 10:25:22.865769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.782 [2024-04-19 10:25:22.865797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.782 
[2024-04-19 10:25:22.865855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.782 [2024-04-19 10:25:22.865869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.782 [2024-04-19 10:25:22.865921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.782 [2024-04-19 10:25:22.865935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.040 #12 NEW cov: 11853 ft: 14084 corp: 11/17b lim: 5 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 CrossOver- 00:07:01.040 [2024-04-19 10:25:22.915353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.040 [2024-04-19 10:25:22.915378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.040 #13 NEW cov: 11853 ft: 14097 corp: 12/18b lim: 5 exec/s: 0 rss: 69Mb L: 1/3 MS: 1 ShuffleBytes- 00:07:01.040 [2024-04-19 10:25:22.955748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.041 [2024-04-19 10:25:22.955772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.041 [2024-04-19 10:25:22.955833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.041 [2024-04-19 10:25:22.955847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.041 [2024-04-19 10:25:22.955898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.041 [2024-04-19 10:25:22.955911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.041 #14 NEW cov: 11853 ft: 14183 corp: 13/21b lim: 5 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 CrossOver- 00:07:01.041 [2024-04-19 10:25:23.005743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.041 [2024-04-19 10:25:23.005769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.041 [2024-04-19 10:25:23.005847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.041 [2024-04-19 10:25:23.005861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.041 #15 NEW cov: 11853 ft: 14196 corp: 14/23b lim: 5 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 CopyPart- 00:07:01.041 [2024-04-19 10:25:23.045974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.041 [2024-04-19 10:25:23.045997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.041 [2024-04-19 10:25:23.046051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.041 [2024-04-19 10:25:23.046064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.041 [2024-04-19 10:25:23.046116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.041 [2024-04-19 10:25:23.046133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.041 #16 NEW cov: 11853 ft: 14211 corp: 15/26b lim: 5 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 InsertByte- 00:07:01.041 [2024-04-19 10:25:23.096338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.041 [2024-04-19 10:25:23.096362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.041 [2024-04-19 10:25:23.096415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.041 [2024-04-19 10:25:23.096430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.041 [2024-04-19 10:25:23.096482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.041 [2024-04-19 10:25:23.096495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.041 [2024-04-19 10:25:23.096546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.041 [2024-04-19 10:25:23.096560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:01.041 #17 NEW cov: 11853 ft: 14492 corp: 16/30b lim: 5 exec/s: 0 rss: 69Mb L: 4/4 MS: 1 InsertByte- 00:07:01.041 [2024-04-19 10:25:23.146018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.041 [2024-04-19 10:25:23.146043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.300 #18 NEW cov: 11853 ft: 14529 corp: 17/31b lim: 5 exec/s: 0 rss: 69Mb L: 1/4 MS: 1 ShuffleBytes- 00:07:01.300 [2024-04-19 10:25:23.186105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.300 [2024-04-19 10:25:23.186128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.300 #19 NEW cov: 11853 ft: 14560 corp: 18/32b lim: 5 exec/s: 0 rss: 70Mb L: 1/4 MS: 1 ChangeByte- 00:07:01.300 [2024-04-19 10:25:23.226368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.300 [2024-04-19 10:25:23.226392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.300 [2024-04-19 10:25:23.226446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.300 [2024-04-19 10:25:23.226460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.300 #20 NEW cov: 11853 ft: 14573 corp: 19/34b lim: 5 exec/s: 0 rss: 70Mb L: 2/4 MS: 1 ChangeBit- 00:07:01.300 [2024-04-19 10:25:23.266481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.300 [2024-04-19 10:25:23.266504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.300 [2024-04-19 10:25:23.266557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.300 [2024-04-19 10:25:23.266573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.300 #21 NEW cov: 11853 ft: 14620 corp: 20/36b lim: 5 exec/s: 0 rss: 70Mb L: 2/4 MS: 1 ChangeBinInt- 00:07:01.300 [2024-04-19 10:25:23.306763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.300 [2024-04-19 10:25:23.306786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.300 [2024-04-19 10:25:23.306841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.300 [2024-04-19 10:25:23.306855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.300 [2024-04-19 10:25:23.306905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.300 [2024-04-19 10:25:23.306918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.559 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:01.559 #22 NEW cov: 11876 ft: 14654 corp: 21/39b lim: 5 exec/s: 22 rss: 71Mb L: 3/4 MS: 1 ChangeBit- 00:07:01.559 [2024-04-19 10:25:23.637564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.559 [2024-04-19 10:25:23.637627] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.559 #23 NEW cov: 11876 ft: 14823 corp: 22/40b lim: 5 exec/s: 23 rss: 71Mb L: 1/4 MS: 1 CopyPart- 00:07:01.819 [2024-04-19 10:25:23.677560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.819 [2024-04-19 10:25:23.677587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.819 [2024-04-19 10:25:23.677658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.819 [2024-04-19 10:25:23.677673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.819 #24 NEW cov: 11876 ft: 14837 corp: 23/42b lim: 5 exec/s: 24 rss: 71Mb L: 2/4 MS: 1 ChangeBit- 00:07:01.819 [2024-04-19 10:25:23.717495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.819 [2024-04-19 10:25:23.717519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.819 #25 NEW cov: 11876 ft: 14863 corp: 24/43b lim: 5 exec/s: 25 rss: 71Mb L: 1/4 MS: 1 ChangeBit- 00:07:01.819 [2024-04-19 10:25:23.757750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.819 [2024-04-19 10:25:23.757774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.819 [2024-04-19 10:25:23.757834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.819 [2024-04-19 10:25:23.757848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.819 #26 NEW cov: 11876 ft: 14875 corp: 25/45b lim: 5 exec/s: 26 rss: 71Mb L: 2/4 MS: 1 ChangeBit- 00:07:01.819 [2024-04-19 10:25:23.797864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.819 [2024-04-19 10:25:23.797888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.819 [2024-04-19 10:25:23.797942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.819 [2024-04-19 10:25:23.797956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.819 #27 NEW cov: 11876 ft: 14887 corp: 26/47b lim: 5 exec/s: 27 rss: 71Mb L: 2/4 MS: 1 EraseBytes- 00:07:01.819 [2024-04-19 10:25:23.838157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.819 [2024-04-19 10:25:23.838181] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.819 [2024-04-19 10:25:23.838236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.819 [2024-04-19 10:25:23.838250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.819 [2024-04-19 10:25:23.838303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.819 [2024-04-19 10:25:23.838317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.819 #28 NEW cov: 11876 ft: 14892 corp: 27/50b lim: 5 exec/s: 28 rss: 71Mb L: 3/4 MS: 1 CrossOver- 00:07:01.819 [2024-04-19 10:25:23.877957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.819 [2024-04-19 10:25:23.877981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.819 #29 NEW cov: 11876 ft: 14893 corp: 28/51b lim: 5 exec/s: 29 rss: 71Mb L: 1/4 MS: 1 ChangeBinInt- 00:07:01.819 [2024-04-19 10:25:23.918219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.819 [2024-04-19 10:25:23.918243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.819 [2024-04-19 10:25:23.918298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.819 [2024-04-19 10:25:23.918312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.078 #30 NEW cov: 11876 ft: 14910 corp: 29/53b lim: 5 exec/s: 30 rss: 71Mb L: 2/4 MS: 1 ChangeByte- 00:07:02.078 [2024-04-19 10:25:23.968663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.078 [2024-04-19 10:25:23.968687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.078 [2024-04-19 10:25:23.968741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.078 [2024-04-19 10:25:23.968755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.078 [2024-04-19 10:25:23.968812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.078 [2024-04-19 10:25:23.968829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.078 [2024-04-19 10:25:23.968882] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.078 [2024-04-19 10:25:23.968896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.078 #31 NEW cov: 11876 ft: 14921 corp: 30/57b lim: 5 exec/s: 31 rss: 72Mb L: 4/4 MS: 1 CopyPart- 00:07:02.078 [2024-04-19 10:25:24.008464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.078 [2024-04-19 10:25:24.008488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.078 [2024-04-19 10:25:24.008542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.078 [2024-04-19 10:25:24.008557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.078 #32 NEW cov: 11876 ft: 14943 corp: 31/59b lim: 5 exec/s: 32 rss: 72Mb L: 2/4 MS: 1 CopyPart- 00:07:02.079 [2024-04-19 10:25:24.048427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.079 [2024-04-19 10:25:24.048451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.079 #33 NEW cov: 11876 ft: 14951 corp: 32/60b lim: 5 exec/s: 33 rss: 72Mb L: 1/4 MS: 1 ShuffleBytes- 00:07:02.079 [2024-04-19 10:25:24.088841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.079 [2024-04-19 10:25:24.088866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.079 [2024-04-19 10:25:24.088921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.079 [2024-04-19 10:25:24.088935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.079 [2024-04-19 10:25:24.088986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.079 [2024-04-19 10:25:24.089000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.079 #34 NEW cov: 11876 ft: 14967 corp: 33/63b lim: 5 exec/s: 34 rss: 72Mb L: 3/4 MS: 1 InsertByte- 00:07:02.079 [2024-04-19 10:25:24.128989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.079 [2024-04-19 10:25:24.129013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.079 [2024-04-19 10:25:24.129069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 
cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.079 [2024-04-19 10:25:24.129083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.079 [2024-04-19 10:25:24.129136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.079 [2024-04-19 10:25:24.129149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.079 #35 NEW cov: 11876 ft: 14981 corp: 34/66b lim: 5 exec/s: 35 rss: 72Mb L: 3/4 MS: 1 CrossOver- 00:07:02.079 [2024-04-19 10:25:24.169251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.079 [2024-04-19 10:25:24.169276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.079 [2024-04-19 10:25:24.169332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.079 [2024-04-19 10:25:24.169347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.079 [2024-04-19 10:25:24.169416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.079 [2024-04-19 10:25:24.169430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.079 [2024-04-19 10:25:24.169485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.079 [2024-04-19 10:25:24.169498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.338 #36 NEW cov: 11876 ft: 14990 corp: 35/70b lim: 5 exec/s: 36 rss: 72Mb L: 4/4 MS: 1 InsertByte- 00:07:02.338 [2024-04-19 10:25:24.209372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.338 [2024-04-19 10:25:24.209396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.338 [2024-04-19 10:25:24.209450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.338 [2024-04-19 10:25:24.209464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.338 [2024-04-19 10:25:24.209518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.338 [2024-04-19 10:25:24.209531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.338 [2024-04-19 10:25:24.209585] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.338 [2024-04-19 10:25:24.209598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.338 #37 NEW cov: 11876 ft: 15038 corp: 36/74b lim: 5 exec/s: 37 rss: 72Mb L: 4/4 MS: 1 ChangeByte- 00:07:02.338 [2024-04-19 10:25:24.259500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.338 [2024-04-19 10:25:24.259525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.338 [2024-04-19 10:25:24.259581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.338 [2024-04-19 10:25:24.259595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.338 [2024-04-19 10:25:24.259649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.338 [2024-04-19 10:25:24.259663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.338 [2024-04-19 10:25:24.259719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.338 [2024-04-19 10:25:24.259732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.338 #38 NEW cov: 11876 ft: 15042 corp: 37/78b lim: 5 exec/s: 38 rss: 72Mb L: 4/4 MS: 1 InsertByte- 00:07:02.338 [2024-04-19 10:25:24.309292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.338 [2024-04-19 10:25:24.309316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.338 [2024-04-19 10:25:24.309371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.338 [2024-04-19 10:25:24.309385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.338 [2024-04-19 10:25:24.349274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.338 [2024-04-19 10:25:24.349298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.338 #40 NEW cov: 11876 ft: 15057 corp: 38/79b lim: 5 exec/s: 40 rss: 72Mb L: 1/4 MS: 2 ChangeByte-EraseBytes- 00:07:02.338 [2024-04-19 10:25:24.389655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.338 [2024-04-19 
10:25:24.389680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.338 [2024-04-19 10:25:24.389737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.338 [2024-04-19 10:25:24.389751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.338 [2024-04-19 10:25:24.389805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.338 [2024-04-19 10:25:24.389824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.338 #41 NEW cov: 11876 ft: 15062 corp: 39/82b lim: 5 exec/s: 41 rss: 72Mb L: 3/4 MS: 1 ShuffleBytes- 00:07:02.338 [2024-04-19 10:25:24.429781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.338 [2024-04-19 10:25:24.429805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.338 [2024-04-19 10:25:24.429865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.338 [2024-04-19 10:25:24.429879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.338 [2024-04-19 10:25:24.429949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.338 [2024-04-19 10:25:24.429962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.598 #42 NEW cov: 11876 ft: 15127 corp: 40/85b lim: 5 exec/s: 21 rss: 72Mb L: 3/4 MS: 1 ChangeByte- 00:07:02.598 #42 DONE cov: 11876 ft: 15127 corp: 40/85b lim: 5 exec/s: 21 rss: 72Mb 00:07:02.598 Done 42 runs in 2 second(s) 00:07:02.598 10:25:24 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_9.conf /var/tmp/suppress_nvmf_fuzz 00:07:02.598 10:25:24 -- ../common.sh@72 -- # (( i++ )) 00:07:02.598 10:25:24 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:02.598 10:25:24 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:07:02.598 10:25:24 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:07:02.598 10:25:24 -- nvmf/run.sh@24 -- # local timen=1 00:07:02.598 10:25:24 -- nvmf/run.sh@25 -- # local core=0x1 00:07:02.598 10:25:24 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:02.598 10:25:24 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:07:02.598 10:25:24 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:02.598 10:25:24 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:02.598 10:25:24 -- nvmf/run.sh@34 -- # printf %02d 10 00:07:02.598 10:25:24 -- nvmf/run.sh@34 -- # port=4410 00:07:02.598 10:25:24 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 
00:07:02.598 10:25:24 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:07:02.598 10:25:24 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:02.598 10:25:24 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:02.598 10:25:24 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:02.598 10:25:24 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 00:07:02.598 [2024-04-19 10:25:24.624713] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:07:02.598 [2024-04-19 10:25:24.624786] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid203767 ] 00:07:02.598 EAL: No free 2048 kB hugepages reported on node 1 00:07:02.857 [2024-04-19 10:25:24.882233] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.857 [2024-04-19 10:25:24.961881] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.116 [2024-04-19 10:25:25.020910] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:03.116 [2024-04-19 10:25:25.037041] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:07:03.116 INFO: Running with entropic power schedule (0xFF, 100). 00:07:03.116 INFO: Seed: 2045745020 00:07:03.116 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:07:03.116 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:07:03.116 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:03.116 INFO: A corpus is not provided, starting from an empty corpus 00:07:03.116 #2 INITED exec/s: 0 rss: 63Mb 00:07:03.116 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:03.116 This may also happen if the target rejected all inputs we tried so far 00:07:03.116 [2024-04-19 10:25:25.086159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:020a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.116 [2024-04-19 10:25:25.086186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.375 NEW_FUNC[1/670]: 0x48eb70 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:07:03.375 NEW_FUNC[2/670]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:03.375 #6 NEW cov: 11655 ft: 11656 corp: 2/12b lim: 40 exec/s: 0 rss: 69Mb L: 11/11 MS: 4 InsertByte-ChangeBinInt-CrossOver-CMP- DE: "\000\000\000\000\000\000\000\000"- 00:07:03.375 [2024-04-19 10:25:25.397237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a7ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.375 [2024-04-19 10:25:25.397273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.375 [2024-04-19 10:25:25.397328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.375 [2024-04-19 10:25:25.397341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.375 [2024-04-19 10:25:25.397397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.375 [2024-04-19 10:25:25.397409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.375 [2024-04-19 10:25:25.397464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.375 [2024-04-19 10:25:25.397477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.375 #13 NEW cov: 11785 ft: 12877 corp: 3/45b lim: 40 exec/s: 0 rss: 69Mb L: 33/33 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:03.375 [2024-04-19 10:25:25.436926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:020a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.375 [2024-04-19 10:25:25.436950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.375 #14 NEW cov: 11791 ft: 13016 corp: 4/56b lim: 40 exec/s: 0 rss: 69Mb L: 11/33 MS: 1 CopyPart- 00:07:03.375 [2024-04-19 10:25:25.477329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a7ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.375 [2024-04-19 10:25:25.477354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.375 [2024-04-19 10:25:25.477410] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.375 [2024-04-19 10:25:25.477423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.375 [2024-04-19 10:25:25.477479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.375 [2024-04-19 10:25:25.477492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.375 [2024-04-19 10:25:25.477548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:fffffdff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.375 [2024-04-19 10:25:25.477561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.634 #15 NEW cov: 11876 ft: 13185 corp: 5/89b lim: 40 exec/s: 0 rss: 69Mb L: 33/33 MS: 1 ChangeBit- 00:07:03.634 [2024-04-19 10:25:25.527139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.634 [2024-04-19 10:25:25.527164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.634 #17 NEW cov: 11876 ft: 13382 corp: 6/101b lim: 40 exec/s: 0 rss: 69Mb L: 12/33 MS: 2 InsertRepeatedBytes-PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:03.634 [2024-04-19 10:25:25.567356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:020a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.635 [2024-04-19 10:25:25.567383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.635 [2024-04-19 10:25:25.567438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0000020a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.635 [2024-04-19 10:25:25.567451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.635 #18 NEW cov: 11876 ft: 13788 corp: 7/122b lim: 40 exec/s: 0 rss: 70Mb L: 21/33 MS: 1 CopyPart- 00:07:03.635 [2024-04-19 10:25:25.607685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.635 [2024-04-19 10:25:25.607710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.635 [2024-04-19 10:25:25.607766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.635 [2024-04-19 10:25:25.607780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.635 [2024-04-19 10:25:25.607821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:03.635 [2024-04-19 10:25:25.607832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.635 [2024-04-19 10:25:25.607887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.635 [2024-04-19 10:25:25.607900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.635 #20 NEW cov: 11876 ft: 13868 corp: 8/154b lim: 40 exec/s: 0 rss: 70Mb L: 32/33 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:03.635 [2024-04-19 10:25:25.647457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.635 [2024-04-19 10:25:25.647482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.635 #21 NEW cov: 11876 ft: 13899 corp: 9/166b lim: 40 exec/s: 0 rss: 70Mb L: 12/33 MS: 1 ShuffleBytes- 00:07:03.635 [2024-04-19 10:25:25.687561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00410005 cdw11:00a8000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.635 [2024-04-19 10:25:25.687587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.635 #25 NEW cov: 11876 ft: 13935 corp: 10/174b lim: 40 exec/s: 0 rss: 70Mb L: 8/33 MS: 4 EraseBytes-ChangeByte-ChangeBinInt-InsertByte- 00:07:03.635 [2024-04-19 10:25:25.728063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a7ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.635 [2024-04-19 10:25:25.728088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.635 [2024-04-19 10:25:25.728144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.635 [2024-04-19 10:25:25.728158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.635 [2024-04-19 10:25:25.728212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.635 [2024-04-19 10:25:25.728225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.635 [2024-04-19 10:25:25.728282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:fffffdff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.635 [2024-04-19 10:25:25.728295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.894 #26 NEW cov: 11876 ft: 13984 corp: 11/207b lim: 40 exec/s: 0 rss: 70Mb L: 33/33 MS: 1 ShuffleBytes- 00:07:03.894 [2024-04-19 10:25:25.777949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:020a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.894 [2024-04-19 10:25:25.777973] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.894 [2024-04-19 10:25:25.778029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0000020a cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.894 [2024-04-19 10:25:25.778043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.894 #27 NEW cov: 11876 ft: 14005 corp: 12/223b lim: 40 exec/s: 0 rss: 70Mb L: 16/33 MS: 1 EraseBytes- 00:07:03.894 [2024-04-19 10:25:25.818272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:020a0000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.894 [2024-04-19 10:25:25.818296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.894 [2024-04-19 10:25:25.818369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.894 [2024-04-19 10:25:25.818383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.894 [2024-04-19 10:25:25.818439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.894 [2024-04-19 10:25:25.818455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.894 [2024-04-19 10:25:25.818512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00020a00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.894 [2024-04-19 10:25:25.818526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.894 #28 NEW cov: 11876 ft: 14095 corp: 13/259b lim: 40 exec/s: 0 rss: 70Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:07:03.894 [2024-04-19 10:25:25.858148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:020a0000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.894 [2024-04-19 10:25:25.858172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.894 [2024-04-19 10:25:25.858228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000200 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.894 [2024-04-19 10:25:25.858241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.894 #29 NEW cov: 11876 ft: 14133 corp: 14/275b lim: 40 exec/s: 0 rss: 70Mb L: 16/36 MS: 1 ShuffleBytes- 00:07:03.895 [2024-04-19 10:25:25.898369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:020a0000 cdw11:00000084 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.895 [2024-04-19 10:25:25.898393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.895 [2024-04-19 10:25:25.898449] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:874f7716 cdw11:f919000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.895 [2024-04-19 10:25:25.898465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.895 [2024-04-19 10:25:25.898536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000200 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.895 [2024-04-19 10:25:25.898549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.895 #30 NEW cov: 11876 ft: 14377 corp: 15/299b lim: 40 exec/s: 0 rss: 70Mb L: 24/36 MS: 1 CMP- DE: "\204\207Ow\026\371\031\000"- 00:07:03.895 [2024-04-19 10:25:25.938355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a7ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.895 [2024-04-19 10:25:25.938380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.895 [2024-04-19 10:25:25.938436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:fffffffd cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.895 [2024-04-19 10:25:25.938448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.895 #31 NEW cov: 11876 ft: 14449 corp: 16/317b lim: 40 exec/s: 0 rss: 70Mb L: 18/36 MS: 1 EraseBytes- 00:07:03.895 [2024-04-19 10:25:25.978397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.895 [2024-04-19 10:25:25.978421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.154 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:04.154 #32 NEW cov: 11899 ft: 14485 corp: 17/329b lim: 40 exec/s: 0 rss: 70Mb L: 12/36 MS: 1 ChangeBinInt- 00:07:04.154 [2024-04-19 10:25:26.018503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.154 [2024-04-19 10:25:26.018528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.154 #33 NEW cov: 11899 ft: 14508 corp: 18/341b lim: 40 exec/s: 0 rss: 70Mb L: 12/36 MS: 1 CrossOver- 00:07:04.154 [2024-04-19 10:25:26.058706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.154 [2024-04-19 10:25:26.058729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.154 [2024-04-19 10:25:26.058788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0c888888 cdw11:88888888 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.154 [2024-04-19 10:25:26.058801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:04.154 #34 NEW cov: 11899 ft: 14514 corp: 19/363b lim: 40 exec/s: 34 rss: 70Mb L: 22/36 MS: 1 InsertRepeatedBytes- 00:07:04.154 [2024-04-19 10:25:26.098832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.154 [2024-04-19 10:25:26.098857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.154 [2024-04-19 10:25:26.098916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.154 [2024-04-19 10:25:26.098929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.154 #35 NEW cov: 11899 ft: 14546 corp: 20/383b lim: 40 exec/s: 35 rss: 70Mb L: 20/36 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:04.154 [2024-04-19 10:25:26.138840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00410005 cdw11:00a80051 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.154 [2024-04-19 10:25:26.138865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.154 #36 NEW cov: 11899 ft: 14571 corp: 21/391b lim: 40 exec/s: 36 rss: 71Mb L: 8/36 MS: 1 ChangeByte- 00:07:04.154 [2024-04-19 10:25:26.178947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.154 [2024-04-19 10:25:26.178970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.154 #37 NEW cov: 11899 ft: 14572 corp: 22/403b lim: 40 exec/s: 37 rss: 71Mb L: 12/36 MS: 1 ChangeBinInt- 00:07:04.154 [2024-04-19 10:25:26.209165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:020a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.154 [2024-04-19 10:25:26.209189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.154 [2024-04-19 10:25:26.209248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0200000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.154 [2024-04-19 10:25:26.209261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.154 #38 NEW cov: 11899 ft: 14660 corp: 23/424b lim: 40 exec/s: 38 rss: 71Mb L: 21/36 MS: 1 ShuffleBytes- 00:07:04.154 [2024-04-19 10:25:26.249370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:020a0000 cdw11:00000084 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.154 [2024-04-19 10:25:26.249394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.154 [2024-04-19 10:25:26.249448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:874f7716 cdw11:11e6000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.154 [2024-04-19 10:25:26.249462] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.154 [2024-04-19 10:25:26.249518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000200 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.154 [2024-04-19 10:25:26.249530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.413 #39 NEW cov: 11899 ft: 14663 corp: 24/448b lim: 40 exec/s: 39 rss: 71Mb L: 24/36 MS: 1 ChangeBinInt- 00:07:04.413 [2024-04-19 10:25:26.289649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a7ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.413 [2024-04-19 10:25:26.289673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.413 [2024-04-19 10:25:26.289730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ff0000ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.413 [2024-04-19 10:25:26.289743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.413 [2024-04-19 10:25:26.289800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.413 [2024-04-19 10:25:26.289817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.413 [2024-04-19 10:25:26.289873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fdffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.413 [2024-04-19 10:25:26.289889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.413 #40 NEW cov: 11899 ft: 14678 corp: 25/483b lim: 40 exec/s: 40 rss: 71Mb L: 35/36 MS: 1 CrossOver- 00:07:04.413 [2024-04-19 10:25:26.339736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a7ffffff cdw11:ffff64ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.413 [2024-04-19 10:25:26.339760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.413 [2024-04-19 10:25:26.339820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.413 [2024-04-19 10:25:26.339834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.413 [2024-04-19 10:25:26.339889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.413 [2024-04-19 10:25:26.339902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.413 [2024-04-19 10:25:26.339957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fffdffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.413 [2024-04-19 
10:25:26.339970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.413 #41 NEW cov: 11899 ft: 14698 corp: 26/519b lim: 40 exec/s: 41 rss: 71Mb L: 36/36 MS: 1 InsertByte- 00:07:04.413 [2024-04-19 10:25:26.389646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a7bfffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.413 [2024-04-19 10:25:26.389670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.413 [2024-04-19 10:25:26.389727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:fffffffd cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.413 [2024-04-19 10:25:26.389740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.413 #42 NEW cov: 11899 ft: 14771 corp: 27/537b lim: 40 exec/s: 42 rss: 71Mb L: 18/36 MS: 1 ChangeBit- 00:07:04.413 [2024-04-19 10:25:26.429641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a7bfffff cdw11:fffffffd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.413 [2024-04-19 10:25:26.429665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.413 #43 NEW cov: 11899 ft: 14792 corp: 28/551b lim: 40 exec/s: 43 rss: 72Mb L: 14/36 MS: 1 EraseBytes- 00:07:04.413 [2024-04-19 10:25:26.470108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a7ffffff cdw11:ffffff64 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.413 [2024-04-19 10:25:26.470133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.413 [2024-04-19 10:25:26.470191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.413 [2024-04-19 10:25:26.470204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.413 [2024-04-19 10:25:26.470262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.413 [2024-04-19 10:25:26.470275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.413 [2024-04-19 10:25:26.470334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fffdffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.413 [2024-04-19 10:25:26.470347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.413 #44 NEW cov: 11899 ft: 14809 corp: 29/587b lim: 40 exec/s: 44 rss: 72Mb L: 36/36 MS: 1 ShuffleBytes- 00:07:04.413 [2024-04-19 10:25:26.510153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:020a0000 cdw11:00000084 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.413 [2024-04-19 10:25:26.510177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.413 [2024-04-19 10:25:26.510235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:874f7716 cdw11:11e6000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.413 [2024-04-19 10:25:26.510249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.413 [2024-04-19 10:25:26.510303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000200 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.413 [2024-04-19 10:25:26.510316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.673 #45 NEW cov: 11899 ft: 14815 corp: 30/618b lim: 40 exec/s: 45 rss: 72Mb L: 31/36 MS: 1 CopyPart- 00:07:04.673 [2024-04-19 10:25:26.550103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00410005 cdw11:00a88487 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.673 [2024-04-19 10:25:26.550127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.673 [2024-04-19 10:25:26.550185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:4f7716f9 cdw11:1900000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.673 [2024-04-19 10:25:26.550198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.673 #46 NEW cov: 11899 ft: 14818 corp: 31/634b lim: 40 exec/s: 46 rss: 72Mb L: 16/36 MS: 1 PersAutoDict- DE: "\204\207Ow\026\371\031\000"- 00:07:04.673 [2024-04-19 10:25:26.590308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.673 [2024-04-19 10:25:26.590332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.673 [2024-04-19 10:25:26.590392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0000fdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.673 [2024-04-19 10:25:26.590406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.673 [2024-04-19 10:25:26.590461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfd000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.673 [2024-04-19 10:25:26.590475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.673 #47 NEW cov: 11899 ft: 14833 corp: 32/658b lim: 40 exec/s: 47 rss: 72Mb L: 24/36 MS: 1 InsertRepeatedBytes- 00:07:04.673 [2024-04-19 10:25:26.630480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a7ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.673 [2024-04-19 10:25:26.630504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.673 [2024-04-19 10:25:26.630562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) 
qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.673 [2024-04-19 10:25:26.630579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.673 [2024-04-19 10:25:26.630634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.673 [2024-04-19 10:25:26.630647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.673 [2024-04-19 10:25:26.630702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.673 [2024-04-19 10:25:26.630715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.673 #48 NEW cov: 11899 ft: 14836 corp: 33/691b lim: 40 exec/s: 48 rss: 72Mb L: 33/36 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:04.673 [2024-04-19 10:25:26.670244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00410005 cdw11:00a8000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.673 [2024-04-19 10:25:26.670267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.673 #49 NEW cov: 11899 ft: 14879 corp: 34/699b lim: 40 exec/s: 49 rss: 72Mb L: 8/36 MS: 1 ShuffleBytes- 00:07:04.673 [2024-04-19 10:25:26.710730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a7ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.673 [2024-04-19 10:25:26.710754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.673 [2024-04-19 10:25:26.710814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.673 [2024-04-19 10:25:26.710827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.673 [2024-04-19 10:25:26.710900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.673 [2024-04-19 10:25:26.710914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.673 [2024-04-19 10:25:26.710969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:fffffdff cdw11:ffffffdf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.673 [2024-04-19 10:25:26.710982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.673 #50 NEW cov: 11899 ft: 14900 corp: 35/732b lim: 40 exec/s: 50 rss: 72Mb L: 33/36 MS: 1 ChangeBit- 00:07:04.673 [2024-04-19 10:25:26.750861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:020a0000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.673 [2024-04-19 10:25:26.750885] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.673 [2024-04-19 10:25:26.750943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.673 [2024-04-19 10:25:26.750957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.673 [2024-04-19 10:25:26.751013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.673 [2024-04-19 10:25:26.751026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.673 [2024-04-19 10:25:26.751085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00027a00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.673 [2024-04-19 10:25:26.751098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.673 #51 NEW cov: 11899 ft: 14931 corp: 36/768b lim: 40 exec/s: 51 rss: 72Mb L: 36/36 MS: 1 ChangeByte- 00:07:04.934 [2024-04-19 10:25:26.790741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.934 [2024-04-19 10:25:26.790764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.934 [2024-04-19 10:25:26.790824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0c888888 cdw11:88008888 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.934 [2024-04-19 10:25:26.790853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.934 #52 NEW cov: 11899 ft: 14989 corp: 37/790b lim: 40 exec/s: 52 rss: 72Mb L: 22/36 MS: 1 CopyPart- 00:07:04.934 [2024-04-19 10:25:26.831087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.934 [2024-04-19 10:25:26.831112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.934 [2024-04-19 10:25:26.831171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:08000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.934 [2024-04-19 10:25:26.831184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.934 [2024-04-19 10:25:26.831239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.934 [2024-04-19 10:25:26.831252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.934 [2024-04-19 10:25:26.831310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
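To read the interleaved libFuzzer status lines (for example "#47 NEW cov: 11899 ft: 14833 corp: 32/658b lim: 40 exec/s: 47 rss: 72Mb L: 24/36 MS: 1 InsertRepeatedBytes-"): #N is the execution count at which the event fired, cov the number of covered code points, ft the finer-grained coverage features, corp the corpus size as units/bytes, lim the current input-length cap, L the new unit's length over the largest unit so far, and MS the mutator sequence that produced it; EraseBytes, CrossOver, ShuffleBytes, PersAutoDict and the rest are libFuzzer's built-in mutators, and DE: names the dictionary entry a CMP or PersAutoDict step used. A hypothetical scraping helper (not part of SPDK; format assumptions noted in the comment) for pulling the headline counters out of such a line:

```c
/* Hypothetical log-scraping helper, not part of SPDK: extract the
 * headline counters from one libFuzzer "NEW" status line. Assumes the
 * plain "<units>/<bytes>b" corpus form seen in this log. */
#include <stdio.h>

struct fuzz_status {
	unsigned long execs, cov, ft, corp_units, corp_bytes;
	unsigned long lim, exec_s, rss_mb;
};

static int
parse_status_line(const char *line, struct fuzz_status *s)
{
	int n = sscanf(line,
		       "#%lu NEW cov: %lu ft: %lu corp: %lu/%lub lim: %lu "
		       "exec/s: %lu rss: %luMb",
		       &s->execs, &s->cov, &s->ft, &s->corp_units,
		       &s->corp_bytes, &s->lim, &s->exec_s, &s->rss_mb);

	return n == 8 ? 0 : -1;	/* 0 only when the whole line matched */
}
```

Applied to the tail of this run, it shows cov plateauing at 11899 while the corpus keeps growing toward 44 units, the usual sign that mutations are adding input variety rather than reaching new code.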
00:07:04.934 [2024-04-19 10:25:26.831323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.934 #53 NEW cov: 11899 ft: 14990 corp: 38/822b lim: 40 exec/s: 53 rss: 72Mb L: 32/36 MS: 1 ChangeBit- 00:07:04.934 [2024-04-19 10:25:26.871234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffa7ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.934 [2024-04-19 10:25:26.871260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.934 [2024-04-19 10:25:26.871318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.934 [2024-04-19 10:25:26.871332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.934 [2024-04-19 10:25:26.871387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.934 [2024-04-19 10:25:26.871401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.934 [2024-04-19 10:25:26.871457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:fffffdff cdw11:ffffffdf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.934 [2024-04-19 10:25:26.871470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.934 #54 NEW cov: 11899 ft: 15006 corp: 39/855b lim: 40 exec/s: 54 rss: 72Mb L: 33/36 MS: 1 ShuffleBytes- 00:07:04.934 [2024-04-19 10:25:26.921119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.934 [2024-04-19 10:25:26.921142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.934 [2024-04-19 10:25:26.921198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.934 [2024-04-19 10:25:26.921212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.934 #55 NEW cov: 11899 ft: 15021 corp: 40/873b lim: 40 exec/s: 55 rss: 72Mb L: 18/36 MS: 1 CopyPart- 00:07:04.934 [2024-04-19 10:25:26.961114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.934 [2024-04-19 10:25:26.961138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.934 #56 NEW cov: 11899 ft: 15052 corp: 41/885b lim: 40 exec/s: 56 rss: 72Mb L: 12/36 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:04.934 [2024-04-19 10:25:27.001368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.934 [2024-04-19 10:25:27.001392] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.934 [2024-04-19 10:25:27.001448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:000000fe cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.934 [2024-04-19 10:25:27.001462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.934 #57 NEW cov: 11899 ft: 15118 corp: 42/903b lim: 40 exec/s: 57 rss: 72Mb L: 18/36 MS: 1 ChangeBinInt- 00:07:04.934 [2024-04-19 10:25:27.041378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:08a8000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.934 [2024-04-19 10:25:27.041402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.194 #58 NEW cov: 11899 ft: 15121 corp: 43/911b lim: 40 exec/s: 58 rss: 72Mb L: 8/36 MS: 1 ChangeBinInt- 00:07:05.194 [2024-04-19 10:25:27.081593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:84874f77 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.194 [2024-04-19 10:25:27.081616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.194 [2024-04-19 10:25:27.081673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16f91900 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.194 [2024-04-19 10:25:27.081687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.194 #59 NEW cov: 11899 ft: 15157 corp: 44/931b lim: 40 exec/s: 29 rss: 72Mb L: 20/36 MS: 1 PersAutoDict- DE: "\204\207Ow\026\371\031\000"- 00:07:05.194 #59 DONE cov: 11899 ft: 15157 corp: 44/931b lim: 40 exec/s: 29 rss: 72Mb 00:07:05.194 ###### Recommended dictionary. ###### 00:07:05.194 "\000\000\000\000\000\000\000\000" # Uses: 4 00:07:05.194 "\204\207Ow\026\371\031\000" # Uses: 2 00:07:05.194 ###### End of recommended dictionary. 
###### 00:07:05.194 Done 59 runs in 2 second(s) 00:07:05.194 10:25:27 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_10.conf /var/tmp/suppress_nvmf_fuzz 00:07:05.194 10:25:27 -- ../common.sh@72 -- # (( i++ )) 00:07:05.194 10:25:27 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:05.194 10:25:27 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:07:05.194 10:25:27 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:07:05.194 10:25:27 -- nvmf/run.sh@24 -- # local timen=1 00:07:05.194 10:25:27 -- nvmf/run.sh@25 -- # local core=0x1 00:07:05.194 10:25:27 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:05.194 10:25:27 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:07:05.194 10:25:27 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:05.194 10:25:27 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:05.194 10:25:27 -- nvmf/run.sh@34 -- # printf %02d 11 00:07:05.194 10:25:27 -- nvmf/run.sh@34 -- # port=4411 00:07:05.194 10:25:27 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:05.194 10:25:27 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:07:05.194 10:25:27 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:05.194 10:25:27 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:05.194 10:25:27 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:05.194 10:25:27 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 00:07:05.194 [2024-04-19 10:25:27.283442] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:07:05.194 [2024-04-19 10:25:27.283514] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid204120 ] 00:07:05.454 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.454 [2024-04-19 10:25:27.539581] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.713 [2024-04-19 10:25:27.618367] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.713 [2024-04-19 10:25:27.677325] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:05.713 [2024-04-19 10:25:27.693486] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:07:05.713 INFO: Running with entropic power schedule (0xFF, 100). 
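Between runs, nvmf/run.sh (traced above) removes the previous run's JSON config and suppression file, then prepares the next instance: it derives the listener port from the fuzzer index (4411 for fuzzer 11), rewrites trsvcid in the generated fuzz_json.conf with sed, records LeakSanitizer suppressions for spdk_nvmf_qpair_disconnect and nvmf_ctrlr_create, and launches llvm_nvme_fuzz with -Z selecting the fuzzer and -F carrying an SPDK transport ID string for the TCP listener. A minimal sketch, assuming SPDK headers and libraries are available (llvm_nvme_fuzz itself may do this differently), of how that -F string maps onto SPDK's transport ID structure via the public parser:

```c
/* Minimal sketch: parse the same transport ID string that run.sh
 * passes to llvm_nvme_fuzz via -F. */
#include <stdio.h>
#include <string.h>
#include "spdk/nvme.h"

int
main(void)
{
	struct spdk_nvme_transport_id trid;
	const char *str = "trtype:tcp adrfam:IPv4 "
			  "subnqn:nqn.2016-06.io.spdk:cnode1 "
			  "traddr:127.0.0.1 trsvcid:4411";

	memset(&trid, 0, sizeof(trid));
	if (spdk_nvme_transport_id_parse(&trid, str) != 0) {
		fprintf(stderr, "failed to parse transport ID\n");
		return 1;
	}

	printf("fuzzing target at %s:%s\n", trid.traddr, trid.trsvcid);
	return 0;
}
```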
00:07:05.713 INFO: Seed: 408788171 00:07:05.713 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:07:05.713 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:07:05.713 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:05.713 INFO: A corpus is not provided, starting from an empty corpus 00:07:05.713 #2 INITED exec/s: 0 rss: 63Mb 00:07:05.713 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:05.713 This may also happen if the target rejected all inputs we tried so far 00:07:05.713 [2024-04-19 10:25:27.748783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:edffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.713 [2024-04-19 10:25:27.748818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.973 NEW_FUNC[1/671]: 0x4908e0 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:07:05.973 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:05.973 #9 NEW cov: 11667 ft: 11654 corp: 2/10b lim: 40 exec/s: 0 rss: 69Mb L: 9/9 MS: 2 ChangeBinInt-CMP- DE: "\377\377\377\377\377\377\377G"- 00:07:05.973 [2024-04-19 10:25:28.069699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:edffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.973 [2024-04-19 10:25:28.069736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.973 [2024-04-19 10:25:28.069798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff47ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.973 [2024-04-19 10:25:28.069818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.232 #10 NEW cov: 11797 ft: 12804 corp: 3/27b lim: 40 exec/s: 0 rss: 69Mb L: 17/17 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377G"- 00:07:06.232 [2024-04-19 10:25:28.119730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0aedffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.232 [2024-04-19 10:25:28.119756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.232 [2024-04-19 10:25:28.119814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff47ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.232 [2024-04-19 10:25:28.119827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.233 #11 NEW cov: 11803 ft: 12920 corp: 4/45b lim: 40 exec/s: 0 rss: 69Mb L: 18/18 MS: 1 CrossOver- 00:07:06.233 [2024-04-19 10:25:28.159698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:edffffff cdw11:fdffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.233 [2024-04-19 10:25:28.159722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.233 #12 NEW cov: 11888 ft: 13252 corp: 5/54b lim: 40 exec/s: 0 rss: 69Mb L: 9/18 MS: 1 ChangeBit- 00:07:06.233 [2024-04-19 10:25:28.199822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:edff7fff cdw11:fdffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.233 [2024-04-19 10:25:28.199847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.233 #13 NEW cov: 11888 ft: 13368 corp: 6/63b lim: 40 exec/s: 0 rss: 69Mb L: 9/18 MS: 1 ChangeBit- 00:07:06.233 [2024-04-19 10:25:28.239912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:edffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.233 [2024-04-19 10:25:28.239936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.233 #14 NEW cov: 11888 ft: 13522 corp: 7/72b lim: 40 exec/s: 0 rss: 69Mb L: 9/18 MS: 1 ChangeByte- 00:07:06.233 [2024-04-19 10:25:28.280064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff47 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.233 [2024-04-19 10:25:28.280088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.233 #15 NEW cov: 11888 ft: 13566 corp: 8/81b lim: 40 exec/s: 0 rss: 69Mb L: 9/18 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377G"- 00:07:06.233 [2024-04-19 10:25:28.320136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fffffffc cdw11:ffffff47 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.233 [2024-04-19 10:25:28.320161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.492 #16 NEW cov: 11888 ft: 13630 corp: 9/90b lim: 40 exec/s: 0 rss: 70Mb L: 9/18 MS: 1 ChangeBinInt- 00:07:06.492 [2024-04-19 10:25:28.360274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.492 [2024-04-19 10:25:28.360299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.492 #17 NEW cov: 11888 ft: 13676 corp: 10/99b lim: 40 exec/s: 0 rss: 70Mb L: 9/18 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377G"- 00:07:06.492 [2024-04-19 10:25:28.400524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ed000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.492 [2024-04-19 10:25:28.400549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.492 [2024-04-19 10:25:28.400602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:feffffff cdw11:ff47ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.492 [2024-04-19 10:25:28.400616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.492 #18 NEW cov: 11888 ft: 13781 corp: 11/116b lim: 40 exec/s: 0 rss: 70Mb L: 17/18 MS: 1 ChangeBinInt- 00:07:06.492 [2024-04-19 10:25:28.440475] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.492 [2024-04-19 10:25:28.440498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.492 #20 NEW cov: 11888 ft: 13824 corp: 12/125b lim: 40 exec/s: 0 rss: 70Mb L: 9/18 MS: 2 ShuffleBytes-CMP- DE: "\001\000\000\000\000\000\004\000"- 00:07:06.492 [2024-04-19 10:25:28.481072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.492 [2024-04-19 10:25:28.481096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.492 [2024-04-19 10:25:28.481153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.492 [2024-04-19 10:25:28.481167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.492 [2024-04-19 10:25:28.481219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.492 [2024-04-19 10:25:28.481233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.492 [2024-04-19 10:25:28.481288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff02 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.492 [2024-04-19 10:25:28.481301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.492 #22 NEW cov: 11888 ft: 14190 corp: 13/157b lim: 40 exec/s: 0 rss: 70Mb L: 32/32 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:06.492 [2024-04-19 10:25:28.520729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffedff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.492 [2024-04-19 10:25:28.520753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.492 #23 NEW cov: 11888 ft: 14209 corp: 14/172b lim: 40 exec/s: 0 rss: 70Mb L: 15/32 MS: 1 CopyPart- 00:07:06.492 [2024-04-19 10:25:28.560857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.492 [2024-04-19 10:25:28.560882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.492 #24 NEW cov: 11888 ft: 14292 corp: 15/182b lim: 40 exec/s: 0 rss: 70Mb L: 10/32 MS: 1 EraseBytes- 00:07:06.492 [2024-04-19 10:25:28.600989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffedffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.492 [2024-04-19 10:25:28.601014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.752 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 
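NEW_FUNC lines flag functions whose code libFuzzer observed for the first time: with no starting corpus ("A corpus is not provided..."), the first generated inputs immediately light up the harness entry points, fuzz_admin_security_send_command() at llvm_nvme_fuzz.c:223 and the shared TestOneInput() driver at line 780, and later even incidental code such as get_rusage() just above. The printed seed (408788171 for this run) makes the sequence reproducible via libFuzzer's -seed= option. A hedged sketch of the harness shape those symbols suggest, using illustrative names rather than SPDK's actual fields:

```c
/* Illustrative harness shape only; the real code is in
 * test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c. Fuzz bytes become
 * the dwords of one admin opcode per fuzzer instance (-Z). */
#include <stdint.h>
#include <stddef.h>
#include <string.h>

struct toy_cmd {
	uint8_t  opc;
	uint32_t cdw10;
	uint32_t cdw11;
};

static void
build_security_send(struct toy_cmd *cmd, const uint8_t *data, size_t size)
{
	cmd->opc = 0x81;	/* SECURITY SEND, as printed in this run */
	if (size >= 8) {
		memcpy(&cmd->cdw10, data, 4);		/* bytes 0..3 */
		memcpy(&cmd->cdw11, data + 4, 4);	/* bytes 4..7 */
	}
}

int
LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
	struct toy_cmd cmd = {0};

	build_security_send(&cmd, data, size);
	/* ...submit over NVMe/TCP and reap the completion; the target
	 * in this log answers INVALID OPCODE for every attempt... */
	return 0;
}
```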
00:07:06.752 #25 NEW cov: 11911 ft: 14314 corp: 16/190b lim: 40 exec/s: 0 rss: 70Mb L: 8/32 MS: 1 CrossOver- 00:07:06.752 [2024-04-19 10:25:28.641024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.752 [2024-04-19 10:25:28.641048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.752 #26 NEW cov: 11911 ft: 14359 corp: 17/199b lim: 40 exec/s: 0 rss: 70Mb L: 9/32 MS: 1 ChangeByte- 00:07:06.752 [2024-04-19 10:25:28.681137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fffffffc cdw11:ff47ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.752 [2024-04-19 10:25:28.681161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.752 #27 NEW cov: 11911 ft: 14377 corp: 18/208b lim: 40 exec/s: 0 rss: 70Mb L: 9/32 MS: 1 ShuffleBytes- 00:07:06.752 [2024-04-19 10:25:28.721573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:edff0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.752 [2024-04-19 10:25:28.721598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.752 [2024-04-19 10:25:28.721651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.752 [2024-04-19 10:25:28.721665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.752 [2024-04-19 10:25:28.721720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:000000ff cdw11:fffdffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.752 [2024-04-19 10:25:28.721734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.752 #28 NEW cov: 11911 ft: 14575 corp: 19/234b lim: 40 exec/s: 28 rss: 70Mb L: 26/32 MS: 1 InsertRepeatedBytes- 00:07:06.752 [2024-04-19 10:25:28.761406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.752 [2024-04-19 10:25:28.761430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.752 #29 NEW cov: 11911 ft: 14595 corp: 20/244b lim: 40 exec/s: 29 rss: 70Mb L: 10/32 MS: 1 InsertByte- 00:07:06.752 [2024-04-19 10:25:28.801942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:edffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.752 [2024-04-19 10:25:28.801966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.752 [2024-04-19 10:25:28.802022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff47ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.752 [2024-04-19 10:25:28.802035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.752 
[2024-04-19 10:25:28.802087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.752 [2024-04-19 10:25:28.802101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.752 [2024-04-19 10:25:28.802156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff47 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.752 [2024-04-19 10:25:28.802169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.752 #30 NEW cov: 11911 ft: 14605 corp: 21/276b lim: 40 exec/s: 30 rss: 71Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:06.752 [2024-04-19 10:25:28.841925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffedff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.752 [2024-04-19 10:25:28.841948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.752 [2024-04-19 10:25:28.842005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.752 [2024-04-19 10:25:28.842018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.752 [2024-04-19 10:25:28.842072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff47ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.752 [2024-04-19 10:25:28.842085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.012 #31 NEW cov: 11911 ft: 14661 corp: 22/306b lim: 40 exec/s: 31 rss: 71Mb L: 30/32 MS: 1 CrossOver- 00:07:07.012 [2024-04-19 10:25:28.882043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:edff0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.012 [2024-04-19 10:25:28.882067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.012 [2024-04-19 10:25:28.882122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00300000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.012 [2024-04-19 10:25:28.882135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.012 [2024-04-19 10:25:28.882187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:fffffdff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.012 [2024-04-19 10:25:28.882201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.012 #32 NEW cov: 11911 ft: 14688 corp: 23/333b lim: 40 exec/s: 32 rss: 71Mb L: 27/32 MS: 1 InsertByte- 00:07:07.012 [2024-04-19 10:25:28.932017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:edffffff cdw11:fdffffed SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.012 [2024-04-19 10:25:28.932041] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.012 [2024-04-19 10:25:28.932098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fffffffd cdw11:ffffff47 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.012 [2024-04-19 10:25:28.932112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.012 #33 NEW cov: 11911 ft: 14691 corp: 24/351b lim: 40 exec/s: 33 rss: 71Mb L: 18/32 MS: 1 CopyPart- 00:07:07.012 [2024-04-19 10:25:28.971939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.012 [2024-04-19 10:25:28.971963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.012 #34 NEW cov: 11911 ft: 14737 corp: 25/361b lim: 40 exec/s: 34 rss: 71Mb L: 10/32 MS: 1 ChangeBinInt- 00:07:07.012 [2024-04-19 10:25:29.012046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.012 [2024-04-19 10:25:29.012070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.012 #35 NEW cov: 11911 ft: 14760 corp: 26/370b lim: 40 exec/s: 35 rss: 71Mb L: 9/32 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\004\000"- 00:07:07.012 [2024-04-19 10:25:29.042141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffedff cdw11:fffdffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.012 [2024-04-19 10:25:29.042169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.012 #36 NEW cov: 11911 ft: 14819 corp: 27/379b lim: 40 exec/s: 36 rss: 71Mb L: 9/32 MS: 1 ShuffleBytes- 00:07:07.012 [2024-04-19 10:25:29.082388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:edffffff cdw11:fdffffed SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.012 [2024-04-19 10:25:29.082412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.012 [2024-04-19 10:25:29.082482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fffffffd cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.012 [2024-04-19 10:25:29.082495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.012 #37 NEW cov: 11911 ft: 14864 corp: 28/397b lim: 40 exec/s: 37 rss: 71Mb L: 18/32 MS: 1 ShuffleBytes- 00:07:07.012 [2024-04-19 10:25:29.122596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ed000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.012 [2024-04-19 10:25:29.122621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.012 [2024-04-19 10:25:29.122680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:feffffff cdw11:ff47ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.012 [2024-04-19 
10:25:29.122697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.272 #38 NEW cov: 11911 ft: 14879 corp: 29/414b lim: 40 exec/s: 38 rss: 71Mb L: 17/32 MS: 1 CopyPart- 00:07:07.272 [2024-04-19 10:25:29.172679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:edffffff cdw11:fdffffed SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.272 [2024-04-19 10:25:29.172703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.272 [2024-04-19 10:25:29.172760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fffffffd cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.272 [2024-04-19 10:25:29.172773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.272 #39 NEW cov: 11911 ft: 14895 corp: 30/432b lim: 40 exec/s: 39 rss: 72Mb L: 18/32 MS: 1 ChangeByte- 00:07:07.272 [2024-04-19 10:25:29.222691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00040047 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.272 [2024-04-19 10:25:29.222715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.272 #40 NEW cov: 11911 ft: 14913 corp: 31/440b lim: 40 exec/s: 40 rss: 72Mb L: 8/32 MS: 1 EraseBytes- 00:07:07.272 [2024-04-19 10:25:29.262787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00009f00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.272 [2024-04-19 10:25:29.262815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.272 #41 NEW cov: 11911 ft: 14926 corp: 32/449b lim: 40 exec/s: 41 rss: 72Mb L: 9/32 MS: 1 ChangeByte- 00:07:07.272 [2024-04-19 10:25:29.303099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:000000ed cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.272 [2024-04-19 10:25:29.303124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.272 [2024-04-19 10:25:29.303181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0400fffd cdw11:47ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.272 [2024-04-19 10:25:29.303197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.272 #42 NEW cov: 11911 ft: 14938 corp: 33/466b lim: 40 exec/s: 42 rss: 72Mb L: 17/32 MS: 1 CrossOver- 00:07:07.272 [2024-04-19 10:25:29.343033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ff24ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.272 [2024-04-19 10:25:29.343056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.272 #43 NEW cov: 11911 ft: 14952 corp: 34/475b lim: 40 exec/s: 43 rss: 72Mb L: 9/32 MS: 1 ChangeByte- 00:07:07.532 [2024-04-19 10:25:29.383179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 
cdw10:fffffffc cdw11:ffff47ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.532 [2024-04-19 10:25:29.383204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.532 #44 NEW cov: 11911 ft: 15001 corp: 35/484b lim: 40 exec/s: 44 rss: 72Mb L: 9/32 MS: 1 ShuffleBytes- 00:07:07.532 [2024-04-19 10:25:29.423378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.532 [2024-04-19 10:25:29.423402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.532 [2024-04-19 10:25:29.423457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:38383838 cdw11:38383838 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.532 [2024-04-19 10:25:29.423470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.532 #45 NEW cov: 11911 ft: 15028 corp: 36/503b lim: 40 exec/s: 45 rss: 72Mb L: 19/32 MS: 1 InsertRepeatedBytes- 00:07:07.532 [2024-04-19 10:25:29.463384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000423 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.532 [2024-04-19 10:25:29.463408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.532 [2024-04-19 10:25:29.503506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000423 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.532 [2024-04-19 10:25:29.503531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.532 #49 NEW cov: 11911 ft: 15040 corp: 37/517b lim: 40 exec/s: 49 rss: 72Mb L: 14/32 MS: 4 EraseBytes-InsertByte-PersAutoDict-CMP- DE: "\377\377\377\377\377\377\377G"-"\377\377\377\377\377\377\377\377"- 00:07:07.532 [2024-04-19 10:25:29.543897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:edff2100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.532 [2024-04-19 10:25:29.543922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.532 [2024-04-19 10:25:29.543980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.532 [2024-04-19 10:25:29.543994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.532 [2024-04-19 10:25:29.544048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:fffffdff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.533 [2024-04-19 10:25:29.544063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.533 #50 NEW cov: 11911 ft: 15066 corp: 38/544b lim: 40 exec/s: 50 rss: 72Mb L: 27/32 MS: 1 InsertByte- 00:07:07.533 [2024-04-19 10:25:29.583744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f0ffffff 
cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.533 [2024-04-19 10:25:29.583772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.533 #51 NEW cov: 11911 ft: 15081 corp: 39/554b lim: 40 exec/s: 51 rss: 72Mb L: 10/32 MS: 1 ChangeBinInt- 00:07:07.533 [2024-04-19 10:25:29.624159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ed000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.533 [2024-04-19 10:25:29.624184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.533 [2024-04-19 10:25:29.624241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:feffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.533 [2024-04-19 10:25:29.624254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.533 [2024-04-19 10:25:29.624310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.533 [2024-04-19 10:25:29.624324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.793 #52 NEW cov: 11911 ft: 15110 corp: 40/582b lim: 40 exec/s: 52 rss: 72Mb L: 28/32 MS: 1 InsertRepeatedBytes- 00:07:07.793 [2024-04-19 10:25:29.664583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:000000ed cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.793 [2024-04-19 10:25:29.664608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.793 [2024-04-19 10:25:29.664662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.793 [2024-04-19 10:25:29.664675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.793 [2024-04-19 10:25:29.664727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.793 [2024-04-19 10:25:29.664741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.793 [2024-04-19 10:25:29.664794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff000004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.793 [2024-04-19 10:25:29.664807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.793 [2024-04-19 10:25:29.664869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00fffd47 cdw11:ffffff47 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.793 [2024-04-19 10:25:29.664882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:07.793 #53 NEW cov: 11911 ft: 15193 corp: 41/622b lim: 40 exec/s: 53 rss: 72Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:07:07.793 
[2024-04-19 10:25:29.714250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:edffffff cdw11:fdffbfed SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.793 [2024-04-19 10:25:29.714275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.793 [2024-04-19 10:25:29.714329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fffffffd cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.793 [2024-04-19 10:25:29.714342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.793 #54 NEW cov: 11911 ft: 15203 corp: 42/640b lim: 40 exec/s: 27 rss: 72Mb L: 18/40 MS: 1 ChangeBit- 00:07:07.793 #54 DONE cov: 11911 ft: 15203 corp: 42/640b lim: 40 exec/s: 27 rss: 72Mb 00:07:07.793 ###### Recommended dictionary. ###### 00:07:07.793 "\377\377\377\377\377\377\377G" # Uses: 4 00:07:07.793 "\001\000\000\000\000\000\004\000" # Uses: 1 00:07:07.793 "\377\377\377\377\377\377\377\377" # Uses: 0 00:07:07.793 ###### End of recommended dictionary. ###### 00:07:07.793 Done 54 runs in 2 second(s) 00:07:07.793 10:25:29 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_11.conf /var/tmp/suppress_nvmf_fuzz 00:07:07.793 10:25:29 -- ../common.sh@72 -- # (( i++ )) 00:07:07.793 10:25:29 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:07.793 10:25:29 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:07:07.793 10:25:29 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:07:07.793 10:25:29 -- nvmf/run.sh@24 -- # local timen=1 00:07:07.793 10:25:29 -- nvmf/run.sh@25 -- # local core=0x1 00:07:07.793 10:25:29 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:07.793 10:25:29 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:07:07.793 10:25:29 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:07.793 10:25:29 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:07.793 10:25:29 -- nvmf/run.sh@34 -- # printf %02d 12 00:07:07.793 10:25:29 -- nvmf/run.sh@34 -- # port=4412 00:07:07.793 10:25:29 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:07.793 10:25:29 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:07:07.793 10:25:29 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:07.793 10:25:29 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:07.793 10:25:29 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:07.793 10:25:29 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 00:07:08.052 [2024-04-19 10:25:29.904420] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
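The nvmf/run.sh trace above shows how each short-fuzz pass is parameterized: start_llvm_fuzz picks the fuzzer index, the TCP service port is "44" plus the zero-padded index (printf %02d 12, hence port=4412), each fuzzer keeps its own corpus directory, and the per-run JSON config is made by rewriting the default trsvcid 4420 in the template. A minimal sketch of that setup loop, assuming an SPDK checkout at $SPDK_DIR (a placeholder for the /var/jenkins/workspace/short-fuzz-phy-autotest/spdk path used in this log) and eliding the -P output path and the LSAN suppression plumbing; redirecting the sed output into the temp config is also an assumption, since the trace records only the sed command itself:

for i in 12 13; do                                # fuzzer indices, as in "start_llvm_fuzz 12 1 0x1" above
  port="44$(printf '%02d' "$i")"                  # printf %02d 12 -> "12", so port=4412
  corpus_dir="$SPDK_DIR/../corpus/llvm_nvmf_$i"   # per-fuzzer corpus, reused between runs
  nvmf_cfg="/tmp/fuzz_json_$i.conf"
  mkdir -p "$corpus_dir"
  # Rewrite the template so the NVMe/TCP target listens on this run's port instead of 4420.
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
  "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
      -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
      -c "$nvmf_cfg" -t 1 -D "$corpus_dir" -Z "$i"
done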
00:07:08.052 [2024-04-19 10:25:29.904493] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid204477 ]
00:07:08.052 EAL: No free 2048 kB hugepages reported on node 1
00:07:08.311 [2024-04-19 10:25:30.166017] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:08.311 [2024-04-19 10:25:30.251290] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:08.311 [2024-04-19 10:25:30.310774] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:08.311 [2024-04-19 10:25:30.326931] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 ***
00:07:08.311 INFO: Running with entropic power schedule (0xFF, 100).
00:07:08.311 INFO: Seed: 3041786836
00:07:08.311 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788),
00:07:08.311 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748),
00:07:08.311 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12
00:07:08.311 INFO: A corpus is not provided, starting from an empty corpus
00:07:08.311 #2 INITED exec/s: 0 rss: 62Mb
00:07:08.311 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:08.311 This may also happen if the target rejected all inputs we tried so far
00:07:08.311 [2024-04-19 10:25:30.372632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:08.311 [2024-04-19 10:25:30.372662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:08.311 [2024-04-19 10:25:30.372721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:08.311 [2024-04-19 10:25:30.372738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:08.311 [2024-04-19 10:25:30.372793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:08.311 [2024-04-19 10:25:30.372807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:08.880 NEW_FUNC[1/671]: 0x492650 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241
00:07:08.880 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:08.880 #5 NEW cov: 11665 ft: 11666 corp: 2/25b lim: 40 exec/s: 0 rss: 69Mb L: 24/24 MS: 3 ChangeByte-ChangeBit-InsertRepeatedBytes-
00:07:08.880 [2024-04-19 10:25:30.713400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:08.880 [2024-04-19 10:25:30.713437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:08.880 [2024-04-19 10:25:30.713493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.880 [2024-04-19 10:25:30.713508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.880 [2024-04-19 10:25:30.713562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:fffbffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.880 [2024-04-19 10:25:30.713575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.880 #6 NEW cov: 11795 ft: 12202 corp: 3/49b lim: 40 exec/s: 0 rss: 69Mb L: 24/24 MS: 1 ChangeBit- 00:07:08.880 [2024-04-19 10:25:30.763548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.880 [2024-04-19 10:25:30.763573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.880 [2024-04-19 10:25:30.763627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.880 [2024-04-19 10:25:30.763640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.880 [2024-04-19 10:25:30.763708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:fffffbff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.880 [2024-04-19 10:25:30.763721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.880 [2024-04-19 10:25:30.763774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fffffffb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.880 [2024-04-19 10:25:30.763787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.880 #7 NEW cov: 11801 ft: 12714 corp: 4/87b lim: 40 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 CopyPart- 00:07:08.880 [2024-04-19 10:25:30.813650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.880 [2024-04-19 10:25:30.813674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.880 [2024-04-19 10:25:30.813729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.880 [2024-04-19 10:25:30.813746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.880 [2024-04-19 10:25:30.813799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:fffffbff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.880 [2024-04-19 10:25:30.813817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.880 [2024-04-19 10:25:30.813869] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:fffffffb cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.880 [2024-04-19 10:25:30.813882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.880 #8 NEW cov: 11886 ft: 13014 corp: 5/121b lim: 40 exec/s: 0 rss: 70Mb L: 34/38 MS: 1 CopyPart- 00:07:08.880 [2024-04-19 10:25:30.853344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ff0128c3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.880 [2024-04-19 10:25:30.853367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.880 #9 NEW cov: 11886 ft: 13897 corp: 6/130b lim: 40 exec/s: 0 rss: 70Mb L: 9/38 MS: 1 CMP- DE: "\377\377\377\377\001(\303\363"- 00:07:08.881 [2024-04-19 10:25:30.893578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.881 [2024-04-19 10:25:30.893603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.881 [2024-04-19 10:25:30.893657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.881 [2024-04-19 10:25:30.893670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.881 #10 NEW cov: 11886 ft: 14116 corp: 7/152b lim: 40 exec/s: 0 rss: 70Mb L: 22/38 MS: 1 EraseBytes- 00:07:08.881 [2024-04-19 10:25:30.934036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.881 [2024-04-19 10:25:30.934061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.881 [2024-04-19 10:25:30.934114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffaaaa cdw11:aaaaaaaa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.881 [2024-04-19 10:25:30.934128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.881 [2024-04-19 10:25:30.934181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aaffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.881 [2024-04-19 10:25:30.934194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.881 [2024-04-19 10:25:30.934244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fbffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.881 [2024-04-19 10:25:30.934257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.881 #11 NEW cov: 11886 ft: 14221 corp: 8/187b lim: 40 exec/s: 0 rss: 70Mb L: 35/38 MS: 1 InsertRepeatedBytes- 00:07:08.881 [2024-04-19 10:25:30.974155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffe4 
cdw11:a0686914 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.881 [2024-04-19 10:25:30.974179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.881 [2024-04-19 10:25:30.974236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:f91900ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.881 [2024-04-19 10:25:30.974249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.881 [2024-04-19 10:25:30.974317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:fffffbff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.881 [2024-04-19 10:25:30.974330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.881 [2024-04-19 10:25:30.974382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fffffffb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.881 [2024-04-19 10:25:30.974395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.140 #12 NEW cov: 11886 ft: 14277 corp: 9/225b lim: 40 exec/s: 0 rss: 70Mb L: 38/38 MS: 1 CMP- DE: "\344\240hi\024\371\031\000"- 00:07:09.141 [2024-04-19 10:25:31.024302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.024328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.141 [2024-04-19 10:25:31.024381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffaaaa cdw11:aaaaaaaa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.024395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.141 [2024-04-19 10:25:31.024445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aaffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.024459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.141 [2024-04-19 10:25:31.024509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.024522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.141 #13 NEW cov: 11886 ft: 14314 corp: 10/264b lim: 40 exec/s: 0 rss: 70Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:07:09.141 [2024-04-19 10:25:31.064575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.064600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.141 [2024-04-19 10:25:31.064653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffaaaa cdw11:aaaaaaaa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.064667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.141 [2024-04-19 10:25:31.064719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:aaffaaaa cdw11:aaaaffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.064732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.141 [2024-04-19 10:25:31.064784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.064797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.141 [2024-04-19 10:25:31.064858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:fffbffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.064872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:09.141 #14 NEW cov: 11886 ft: 14408 corp: 11/304b lim: 40 exec/s: 0 rss: 70Mb L: 40/40 MS: 1 CrossOver- 00:07:09.141 [2024-04-19 10:25:31.104534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.104559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.141 [2024-04-19 10:25:31.104612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.104626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.141 [2024-04-19 10:25:31.104678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:fffffbff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.104691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.141 [2024-04-19 10:25:31.104742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fffffffb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.104755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.141 #15 NEW cov: 11886 ft: 14466 corp: 12/343b lim: 40 exec/s: 0 rss: 70Mb L: 39/40 MS: 1 InsertByte- 00:07:09.141 [2024-04-19 10:25:31.144601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.144626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.141 [2024-04-19 10:25:31.144682] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:000000f7 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.144695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.141 [2024-04-19 10:25:31.144746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:fffffbff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.144759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.141 [2024-04-19 10:25:31.144817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:fffffffb cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.144830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.141 #16 NEW cov: 11886 ft: 14501 corp: 13/377b lim: 40 exec/s: 0 rss: 70Mb L: 34/40 MS: 1 ChangeBinInt- 00:07:09.141 [2024-04-19 10:25:31.184275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ff0128c3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.184299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.141 #17 NEW cov: 11886 ft: 14539 corp: 14/386b lim: 40 exec/s: 0 rss: 70Mb L: 9/40 MS: 1 CopyPart- 00:07:09.141 [2024-04-19 10:25:31.234830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.234857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.141 [2024-04-19 10:25:31.234910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.234923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.141 [2024-04-19 10:25:31.234992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:fffffbff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.235005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.141 [2024-04-19 10:25:31.235057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fffffffb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.141 [2024-04-19 10:25:31.235070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.401 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:09.401 #18 NEW cov: 11909 ft: 14623 corp: 15/424b lim: 40 exec/s: 0 rss: 70Mb L: 38/40 MS: 1 ShuffleBytes- 00:07:09.401 [2024-04-19 10:25:31.274497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ff0128c3 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.401 [2024-04-19 10:25:31.274523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.401 #19 NEW cov: 11909 ft: 14669 corp: 16/433b lim: 40 exec/s: 0 rss: 70Mb L: 9/40 MS: 1 CrossOver- 00:07:09.401 [2024-04-19 10:25:31.324971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.401 [2024-04-19 10:25:31.324995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.401 [2024-04-19 10:25:31.325051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:07000000 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.401 [2024-04-19 10:25:31.325064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.401 [2024-04-19 10:25:31.325117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.401 [2024-04-19 10:25:31.325129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.401 #20 NEW cov: 11909 ft: 14675 corp: 17/457b lim: 40 exec/s: 0 rss: 70Mb L: 24/40 MS: 1 ChangeBinInt- 00:07:09.401 [2024-04-19 10:25:31.364761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ff0128c3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.401 [2024-04-19 10:25:31.364784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.401 #21 NEW cov: 11909 ft: 14688 corp: 18/467b lim: 40 exec/s: 21 rss: 70Mb L: 10/40 MS: 1 CrossOver- 00:07:09.401 [2024-04-19 10:25:31.414906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ff0128c3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.401 [2024-04-19 10:25:31.414930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.401 #22 NEW cov: 11909 ft: 14701 corp: 19/480b lim: 40 exec/s: 22 rss: 70Mb L: 13/40 MS: 1 CrossOver- 00:07:09.401 [2024-04-19 10:25:31.455423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.401 [2024-04-19 10:25:31.455447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.401 [2024-04-19 10:25:31.455504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000030 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.401 [2024-04-19 10:25:31.455518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.401 [2024-04-19 10:25:31.455570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:fffffbff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.401 [2024-04-19 10:25:31.455583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.401 [2024-04-19 10:25:31.455634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:fffffffb cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.401 [2024-04-19 10:25:31.455648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.401 #23 NEW cov: 11909 ft: 14746 corp: 20/514b lim: 40 exec/s: 23 rss: 71Mb L: 34/40 MS: 1 ChangeByte- 00:07:09.401 [2024-04-19 10:25:31.495567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.401 [2024-04-19 10:25:31.495590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.401 [2024-04-19 10:25:31.495644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.401 [2024-04-19 10:25:31.495658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.401 [2024-04-19 10:25:31.495726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:5bfffffb cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.401 [2024-04-19 10:25:31.495739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.401 [2024-04-19 10:25:31.495793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.401 [2024-04-19 10:25:31.495806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.661 #24 NEW cov: 11909 ft: 14765 corp: 21/553b lim: 40 exec/s: 24 rss: 71Mb L: 39/40 MS: 1 InsertByte- 00:07:09.661 [2024-04-19 10:25:31.545710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffe4a068 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.661 [2024-04-19 10:25:31.545734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.661 [2024-04-19 10:25:31.545788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:6914f919 cdw11:00ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.661 [2024-04-19 10:25:31.545801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.661 [2024-04-19 10:25:31.545875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:07000000 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.661 [2024-04-19 10:25:31.545889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.661 [2024-04-19 10:25:31.545940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.661 [2024-04-19 10:25:31.545953] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.661 #25 NEW cov: 11909 ft: 14779 corp: 22/585b lim: 40 exec/s: 25 rss: 71Mb L: 32/40 MS: 1 PersAutoDict- DE: "\344\240hi\024\371\031\000"- 00:07:09.661 [2024-04-19 10:25:31.585393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ff0128c3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.661 [2024-04-19 10:25:31.585417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.661 #26 NEW cov: 11909 ft: 14790 corp: 23/593b lim: 40 exec/s: 26 rss: 71Mb L: 8/40 MS: 1 EraseBytes- 00:07:09.661 [2024-04-19 10:25:31.625940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.661 [2024-04-19 10:25:31.625964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.661 [2024-04-19 10:25:31.626033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffaaaa cdw11:aaaaaaaa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.661 [2024-04-19 10:25:31.626046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.661 [2024-04-19 10:25:31.626101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aaffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.661 [2024-04-19 10:25:31.626114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.661 [2024-04-19 10:25:31.626165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.661 [2024-04-19 10:25:31.626178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.661 #27 NEW cov: 11909 ft: 14843 corp: 24/632b lim: 40 exec/s: 27 rss: 71Mb L: 39/40 MS: 1 ShuffleBytes- 00:07:09.661 [2024-04-19 10:25:31.665741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ff0128c3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.661 [2024-04-19 10:25:31.665764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.661 [2024-04-19 10:25:31.665839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000af3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.661 [2024-04-19 10:25:31.665853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.661 #28 NEW cov: 11909 ft: 14859 corp: 25/648b lim: 40 exec/s: 28 rss: 71Mb L: 16/40 MS: 1 InsertRepeatedBytes- 00:07:09.661 [2024-04-19 10:25:31.706156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0aff0aff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.661 [2024-04-19 10:25:31.706180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:09.661 [2024-04-19 10:25:31.706251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.661 [2024-04-19 10:25:31.706265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.661 [2024-04-19 10:25:31.706317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.661 [2024-04-19 10:25:31.706331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.661 [2024-04-19 10:25:31.706385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.661 [2024-04-19 10:25:31.706400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.661 #31 NEW cov: 11909 ft: 14942 corp: 26/683b lim: 40 exec/s: 31 rss: 71Mb L: 35/40 MS: 3 EraseBytes-CopyPart-InsertRepeatedBytes- 00:07:09.661 [2024-04-19 10:25:31.745797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ff290128 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.661 [2024-04-19 10:25:31.745826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.661 #32 NEW cov: 11909 ft: 14952 corp: 27/694b lim: 40 exec/s: 32 rss: 71Mb L: 11/40 MS: 1 InsertByte- 00:07:09.921 [2024-04-19 10:25:31.785949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ff29014b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.921 [2024-04-19 10:25:31.785974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.921 #33 NEW cov: 11909 ft: 14959 corp: 28/705b lim: 40 exec/s: 33 rss: 71Mb L: 11/40 MS: 1 ChangeByte- 00:07:09.921 [2024-04-19 10:25:31.826332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.921 [2024-04-19 10:25:31.826356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.921 [2024-04-19 10:25:31.826411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.921 [2024-04-19 10:25:31.826424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.921 [2024-04-19 10:25:31.826476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0128c30a cdw11:f3ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.921 [2024-04-19 10:25:31.826489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.921 #34 NEW cov: 11909 ft: 14989 corp: 29/732b lim: 40 exec/s: 34 rss: 71Mb L: 27/40 MS: 1 CrossOver- 00:07:09.921 [2024-04-19 10:25:31.866615] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ff0009 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.921 [2024-04-19 10:25:31.866639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.921 [2024-04-19 10:25:31.866693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.921 [2024-04-19 10:25:31.866706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.921 [2024-04-19 10:25:31.866758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:fffffbff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.921 [2024-04-19 10:25:31.866771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.921 [2024-04-19 10:25:31.866842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:fffffffb cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.921 [2024-04-19 10:25:31.866856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.921 #35 NEW cov: 11909 ft: 15018 corp: 30/766b lim: 40 exec/s: 35 rss: 71Mb L: 34/40 MS: 1 ChangeBinInt- 00:07:09.921 [2024-04-19 10:25:31.906721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.921 [2024-04-19 10:25:31.906745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.921 [2024-04-19 10:25:31.906825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffaaaa cdw11:aaaaaaaa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.921 [2024-04-19 10:25:31.906840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.921 [2024-04-19 10:25:31.906901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aaffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.921 [2024-04-19 10:25:31.906915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.921 [2024-04-19 10:25:31.906967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffff23ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.921 [2024-04-19 10:25:31.906979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.921 #36 NEW cov: 11909 ft: 15027 corp: 31/805b lim: 40 exec/s: 36 rss: 71Mb L: 39/40 MS: 1 ChangeByte- 00:07:09.921 [2024-04-19 10:25:31.946681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffff0300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.921 [2024-04-19 10:25:31.946704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.921 [2024-04-19 
10:25:31.946776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.921 [2024-04-19 10:25:31.946790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.921 [2024-04-19 10:25:31.946847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:fffbffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.921 [2024-04-19 10:25:31.946861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.921 #37 NEW cov: 11909 ft: 15028 corp: 32/829b lim: 40 exec/s: 37 rss: 71Mb L: 24/40 MS: 1 ChangeBinInt- 00:07:09.921 [2024-04-19 10:25:31.986893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.921 [2024-04-19 10:25:31.986917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.921 [2024-04-19 10:25:31.986973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.921 [2024-04-19 10:25:31.986986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.922 [2024-04-19 10:25:31.987037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:fffffbff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.922 [2024-04-19 10:25:31.987050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.922 [2024-04-19 10:25:31.987101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:40ffffff cdw11:fffffffb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.922 [2024-04-19 10:25:31.987114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.922 #38 NEW cov: 11909 ft: 15043 corp: 33/868b lim: 40 exec/s: 38 rss: 72Mb L: 39/40 MS: 1 ChangeByte- 00:07:09.922 [2024-04-19 10:25:32.027061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.922 [2024-04-19 10:25:32.027086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.922 [2024-04-19 10:25:32.027145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.922 [2024-04-19 10:25:32.027158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.922 [2024-04-19 10:25:32.027213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0128c30a cdw11:f3ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.922 [2024-04-19 10:25:32.027226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:07:09.922 [2024-04-19 10:25:32.027280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:28c30af3 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.922 [2024-04-19 10:25:32.027293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.181 #39 NEW cov: 11909 ft: 15054 corp: 34/903b lim: 40 exec/s: 39 rss: 72Mb L: 35/40 MS: 1 CopyPart- 00:07:10.181 [2024-04-19 10:25:32.076715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:efff0128 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.181 [2024-04-19 10:25:32.076738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.181 #40 NEW cov: 11909 ft: 15062 corp: 35/913b lim: 40 exec/s: 40 rss: 72Mb L: 10/40 MS: 1 InsertByte- 00:07:10.181 [2024-04-19 10:25:32.117113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.181 [2024-04-19 10:25:32.117137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.181 [2024-04-19 10:25:32.117210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:f5ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.181 [2024-04-19 10:25:32.117224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.181 [2024-04-19 10:25:32.117276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:fffbffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.181 [2024-04-19 10:25:32.117289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.181 #41 NEW cov: 11909 ft: 15077 corp: 36/937b lim: 40 exec/s: 41 rss: 72Mb L: 24/40 MS: 1 ChangeBinInt- 00:07:10.181 [2024-04-19 10:25:32.156932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ff0128c3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.181 [2024-04-19 10:25:32.156956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.181 #42 NEW cov: 11909 ft: 15144 corp: 37/946b lim: 40 exec/s: 42 rss: 72Mb L: 9/40 MS: 1 CopyPart- 00:07:10.182 [2024-04-19 10:25:32.197071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ff012828 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.182 [2024-04-19 10:25:32.197094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.182 #43 NEW cov: 11909 ft: 15152 corp: 38/955b lim: 40 exec/s: 43 rss: 72Mb L: 9/40 MS: 1 CrossOver- 00:07:10.182 [2024-04-19 10:25:32.237652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.182 [2024-04-19 10:25:32.237677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.182 
[2024-04-19 10:25:32.237734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffaaaa cdw11:aaaaaaaa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.182 [2024-04-19 10:25:32.237748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.182 [2024-04-19 10:25:32.237799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aaff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.182 [2024-04-19 10:25:32.237818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.182 [2024-04-19 10:25:32.237872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:fbffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.182 [2024-04-19 10:25:32.237886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.182 #44 NEW cov: 11909 ft: 15156 corp: 39/990b lim: 40 exec/s: 44 rss: 72Mb L: 35/40 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:10.182 [2024-04-19 10:25:32.277761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.182 [2024-04-19 10:25:32.277785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.182 [2024-04-19 10:25:32.277856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.182 [2024-04-19 10:25:32.277870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.182 [2024-04-19 10:25:32.277923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:fffffbff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.182 [2024-04-19 10:25:32.277936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.182 [2024-04-19 10:25:32.277988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fffffffb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.182 [2024-04-19 10:25:32.278001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.442 #45 NEW cov: 11909 ft: 15167 corp: 40/1028b lim: 40 exec/s: 45 rss: 72Mb L: 38/40 MS: 1 ChangeBit- 00:07:10.442 [2024-04-19 10:25:32.317868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.442 [2024-04-19 10:25:32.317893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.442 [2024-04-19 10:25:32.317949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffaaaa cdw11:aaaaaaaa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.442 [2024-04-19 10:25:32.317963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:10.442 [2024-04-19 10:25:32.318016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:aaffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.442 [2024-04-19 10:25:32.318028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.442 [2024-04-19 10:25:32.318081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fbffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.442 [2024-04-19 10:25:32.318094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.442 #46 NEW cov: 11909 ft: 15199 corp: 41/1063b lim: 40 exec/s: 46 rss: 72Mb L: 35/40 MS: 1 EraseBytes- 00:07:10.442 [2024-04-19 10:25:32.357670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:c7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.442 [2024-04-19 10:25:32.357694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.442 [2024-04-19 10:25:32.357767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:fffbffff cdw11:fffffbff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.442 [2024-04-19 10:25:32.357781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.442 #47 NEW cov: 11909 ft: 15215 corp: 42/1084b lim: 40 exec/s: 23 rss: 72Mb L: 21/40 MS: 1 EraseBytes- 00:07:10.442 #47 DONE cov: 11909 ft: 15215 corp: 42/1084b lim: 40 exec/s: 23 rss: 72Mb 00:07:10.442 ###### Recommended dictionary. ###### 00:07:10.442 "\377\377\377\377\001(\303\363" # Uses: 0 00:07:10.442 "\344\240hi\024\371\031\000" # Uses: 1 00:07:10.442 "\000\000\000\000" # Uses: 0 00:07:10.442 ###### End of recommended dictionary. 
###### 00:07:10.442 Done 47 runs in 2 second(s) 00:07:10.442 10:25:32 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_12.conf /var/tmp/suppress_nvmf_fuzz 00:07:10.442 10:25:32 -- ../common.sh@72 -- # (( i++ )) 00:07:10.442 10:25:32 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:10.442 10:25:32 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:07:10.442 10:25:32 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:07:10.442 10:25:32 -- nvmf/run.sh@24 -- # local timen=1 00:07:10.442 10:25:32 -- nvmf/run.sh@25 -- # local core=0x1 00:07:10.442 10:25:32 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:10.442 10:25:32 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:07:10.442 10:25:32 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:10.442 10:25:32 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:10.442 10:25:32 -- nvmf/run.sh@34 -- # printf %02d 13 00:07:10.442 10:25:32 -- nvmf/run.sh@34 -- # port=4413 00:07:10.442 10:25:32 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:10.442 10:25:32 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:07:10.442 10:25:32 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:10.442 10:25:32 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:10.442 10:25:32 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:10.442 10:25:32 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 00:07:10.442 [2024-04-19 10:25:32.544923] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:07:10.442 [2024-04-19 10:25:32.544995] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid204814 ] 00:07:10.702 EAL: No free 2048 kB hugepages reported on node 1 00:07:10.702 [2024-04-19 10:25:32.812358] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.962 [2024-04-19 10:25:32.896405] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.962 [2024-04-19 10:25:32.955348] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:10.962 [2024-04-19 10:25:32.971477] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:07:10.962 INFO: Running with entropic power schedule (0xFF, 100). 
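Each fuzzer pass above ends the same way: a "#N DONE cov: ..." line with the final coverage, feature and corpus counters, a recommended-dictionary block listing the byte sequences libFuzzer found useful (with their use counts), and a "Done N runs in M second(s)" line. Those markers make it easy to pull a per-fuzzer summary back out of the raw console output; a small hedged example, assuming the log was captured to a file named build.log (placeholder name):

# One "#N DONE cov:" summary and one run-count line per fuzzer pass.
grep -E '#[0-9]+ DONE cov:|Done [0-9]+ runs in' build.log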
00:07:10.962 INFO: Running with entropic power schedule (0xFF, 100).
00:07:10.962 INFO: Seed: 1388843400
00:07:10.962 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788),
00:07:10.962 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748),
00:07:10.962 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13
00:07:10.962 INFO: A corpus is not provided, starting from an empty corpus
00:07:10.962 #2 INITED exec/s: 0 rss: 63Mb
00:07:10.962 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? This may also happen if the target rejected all inputs we tried so far
00:07:10.962 [2024-04-19 10:25:33.019248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:10.962 [2024-04-19 10:25:33.019275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:11.529 NEW_FUNC[1/670]: 0x494210 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257
00:07:11.529 NEW_FUNC[2/670]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:11.529 #13 NEW cov: 11653 ft: 11654 corp: 2/10b lim: 40 exec/s: 0 rss: 69Mb L: 9/9 MS: 1 InsertRepeatedBytes-
00:07:11.529 [2024-04-19 10:25:33.350107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0900 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.529 [2024-04-19 10:25:33.350153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:11.529 #19 NEW cov: 11783 ft: 12063 corp: 3/19b lim: 40 exec/s: 0 rss: 69Mb L: 9/9 MS: 1 ChangeBinInt-
00:07:11.529 [2024-04-19 10:25:33.400135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff010000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.529 [2024-04-19 10:25:33.400161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:11.529 #20 NEW cov: 11789 ft: 12462 corp: 4/28b lim: 40 exec/s: 0 rss: 69Mb L: 9/9 MS: 1 ChangeBinInt-
00:07:11.529 [2024-04-19 10:25:33.440229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffff01 cdw11:0000f40a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.529 [2024-04-19 10:25:33.440253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:11.529 #26 NEW cov: 11874 ft: 12698 corp: 5/36b lim: 40 exec/s: 0 rss: 69Mb L: 8/9 MS: 1 CrossOver-
00:07:11.529 [2024-04-19 10:25:33.480321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0100 cdw11:0000f40a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.529 [2024-04-19 10:25:33.480347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:11.529 #27 NEW cov: 11874 ft: 12807 corp: 6/44b lim: 40 exec/s: 0 rss: 69Mb L: 8/8 MS: 1 CopyPart-
00:07:11.529 [2024-04-19 10:25:33.520786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0100 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.529 [2024-04-19 10:25:33.520815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:11.529 [2024-04-19 10:25:33.520872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.529 [2024-04-19 10:25:33.520886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:11.529 [2024-04-19 10:25:33.520944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.529 [2024-04-19 10:25:33.520958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:11.529 [2024-04-19 10:25:33.521016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.529 [2024-04-19 10:25:33.521032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:11.529 #28 NEW cov: 11874 ft: 13489 corp: 7/82b lim: 40 exec/s: 0 rss: 70Mb L: 38/38 MS: 1 InsertRepeatedBytes-
00:07:11.529 [2024-04-19 10:25:33.570560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff0000ff cdw11:fff4010a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.529 [2024-04-19 10:25:33.570586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:11.529 #29 NEW cov: 11874 ft: 13593 corp: 8/90b lim: 40 exec/s: 0 rss: 70Mb L: 8/38 MS: 1 ShuffleBytes-
00:07:11.529 [2024-04-19 10:25:33.611020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0100 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.529 [2024-04-19 10:25:33.611046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:11.529 [2024-04-19 10:25:33.611106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.529 [2024-04-19 10:25:33.611120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:11.529 [2024-04-19 10:25:33.611194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.529 [2024-04-19 10:25:33.611207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:11.529 [2024-04-19 10:25:33.611265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:76767676 cdw11:76768989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.529 [2024-04-19 10:25:33.611279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:11.788 #30 NEW cov: 11874 ft: 13605 corp: 9/128b lim: 40 exec/s: 0 rss: 70Mb L: 38/38 MS: 1 ChangeBinInt-
00:07:11.788 [2024-04-19 10:25:33.661077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:000005f4 cdw11:cfffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.788 [2024-04-19 10:25:33.661102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:11.788 [2024-04-19 10:25:33.661161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.788 [2024-04-19 10:25:33.661175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:11.788 [2024-04-19 10:25:33.661231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.788 [2024-04-19 10:25:33.661245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:11.788 #40 NEW cov: 11874 ft: 13910 corp: 10/155b lim: 40 exec/s: 0 rss: 70Mb L: 27/38 MS: 5 EraseBytes-ChangeByte-ChangeBinInt-ShuffleBytes-InsertRepeatedBytes-
00:07:11.788 [2024-04-19 10:25:33.700892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff9300ff cdw11:fff4010a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.788 [2024-04-19 10:25:33.700916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:11.788 #41 NEW cov: 11874 ft: 14000 corp: 11/163b lim: 40 exec/s: 0 rss: 70Mb L: 8/38 MS: 1 ChangeByte-
00:07:11.788 [2024-04-19 10:25:33.741564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0100 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.788 [2024-04-19 10:25:33.741592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:11.788 [2024-04-19 10:25:33.741650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.788 [2024-04-19 10:25:33.741664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:11.788 [2024-04-19 10:25:33.741720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:76767676 cdw11:7676ff1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.788 [2024-04-19 10:25:33.741733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:11.788 [2024-04-19 10:25:33.741789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.788 [2024-04-19 10:25:33.741803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:11.788 [2024-04-19 10:25:33.741865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:76767676 cdw11:0000f40a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.788 [2024-04-19 10:25:33.741878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:11.788 #42 NEW cov: 11874 ft: 14063 corp: 12/203b lim: 40 exec/s: 0 rss: 70Mb L: 40/40 MS: 1 CMP- DE: "\377\036"-
00:07:11.788 [2024-04-19 10:25:33.781632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0100 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.788 [2024-04-19 10:25:33.781656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:11.789 [2024-04-19 10:25:33.781716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.789 [2024-04-19 10:25:33.781729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:11.789 [2024-04-19 10:25:33.781787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:76767676 cdw11:7676ff1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.789 [2024-04-19 10:25:33.781800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:11.789 [2024-04-19 10:25:33.781877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.789 [2024-04-19 10:25:33.781890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:11.789 [2024-04-19 10:25:33.781947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:76767629 cdw11:0000f40a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.789 [2024-04-19 10:25:33.781961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:11.789 #43 NEW cov: 11874 ft: 14111 corp: 13/243b lim: 40 exec/s: 0 rss: 70Mb L: 40/40 MS: 1 ChangeByte-
00:07:11.789 [2024-04-19 10:25:33.831682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0100 cdw11:60767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.789 [2024-04-19 10:25:33.831707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:11.789 [2024-04-19 10:25:33.831764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.789 [2024-04-19 10:25:33.831784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:11.789 [2024-04-19 10:25:33.831841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.789 [2024-04-19 10:25:33.831855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:11.789 [2024-04-19 10:25:33.831913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:76767676 cdw11:76768989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.789 [2024-04-19 10:25:33.831926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:11.789 #44 NEW cov: 11874 ft: 14134 corp: 14/281b lim: 40 exec/s: 0 rss: 70Mb L: 38/40 MS: 1 ChangeByte-
00:07:11.789 [2024-04-19 10:25:33.881447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffff1e cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:11.789 [2024-04-19 10:25:33.881472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.048 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:07:12.048 #45 NEW cov: 11897 ft: 14191 corp: 15/292b lim: 40 exec/s: 0 rss: 70Mb L: 11/40 MS: 1 PersAutoDict- DE: "\377\036"-
00:07:12.048 [2024-04-19 10:25:33.921801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:000005f4 cdw11:cfffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.048 [2024-04-19 10:25:33.921829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.048 [2024-04-19 10:25:33.921887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:1b000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.048 [2024-04-19 10:25:33.921901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:12.048 [2024-04-19 10:25:33.921957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.048 [2024-04-19 10:25:33.921971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:12.048 #46 NEW cov: 11897 ft: 14232 corp: 16/319b lim: 40 exec/s: 0 rss: 70Mb L: 27/40 MS: 1 ChangeBinInt-
00:07:12.048 [2024-04-19 10:25:33.961650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ffff09ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.048 [2024-04-19 10:25:33.961675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.048 #47 NEW cov: 11897 ft: 14241 corp: 17/328b lim: 40 exec/s: 0 rss: 70Mb L: 9/40 MS: 1 ShuffleBytes-
00:07:12.048 [2024-04-19 10:25:34.001901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0000ff1b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.048 [2024-04-19 10:25:34.001925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.048 [2024-04-19 10:25:34.001984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.048 [2024-04-19 10:25:34.001998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:12.048 #48 NEW cov: 11897 ft: 14434 corp: 18/350b lim: 40 exec/s: 48 rss: 71Mb L: 22/40 MS: 1 EraseBytes-
00:07:12.048 [2024-04-19 10:25:34.052080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a3be5e5 cdw11:e5e5e5e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.048 [2024-04-19 10:25:34.052107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.048 [2024-04-19 10:25:34.052166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.048 [2024-04-19 10:25:34.052180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:12.048 #52 NEW cov: 11897 ft: 14472 corp: 19/370b lim: 40 exec/s: 52 rss: 71Mb L: 20/40 MS: 4 ShuffleBytes-InsertByte-InsertByte-InsertRepeatedBytes-
00:07:12.048 [2024-04-19 10:25:34.092467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0100 cdw11:60767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.048 [2024-04-19 10:25:34.092491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.048 [2024-04-19 10:25:34.092550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.048 [2024-04-19 10:25:34.092564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:12.048 [2024-04-19 10:25:34.092620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:76767676 cdw11:7676ff1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.048 [2024-04-19 10:25:34.092634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:12.048 [2024-04-19 10:25:34.092690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:76767676 cdw11:76768989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.048 [2024-04-19 10:25:34.092703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:12.048 #53 NEW cov: 11897 ft: 14489 corp: 20/408b lim: 40 exec/s: 53 rss: 71Mb L: 38/40 MS: 1 PersAutoDict- DE: "\377\036"-
00:07:12.048 [2024-04-19 10:25:34.142598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0100 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.048 [2024-04-19 10:25:34.142623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.048 [2024-04-19 10:25:34.142699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.048 [2024-04-19 10:25:34.142713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:12.048 [2024-04-19 10:25:34.142772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.048 [2024-04-19 10:25:34.142786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:12.048 [2024-04-19 10:25:34.142847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.048 [2024-04-19 10:25:34.142861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:12.308 #54 NEW cov: 11897 ft: 14510 corp: 21/447b lim: 40 exec/s: 54 rss: 71Mb L: 39/40 MS: 1 InsertByte-
00:07:12.308 [2024-04-19 10:25:34.182701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0100 cdw11:60767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.308 [2024-04-19 10:25:34.182726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.308 [2024-04-19 10:25:34.182786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.308 [2024-04-19 10:25:34.182802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:12.308 [2024-04-19 10:25:34.182879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:76767676 cdw11:7676ff1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.308 [2024-04-19 10:25:34.182893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:12.308 [2024-04-19 10:25:34.182950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:76767676 cdw11:76768989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.308 [2024-04-19 10:25:34.182964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:12.309 #55 NEW cov: 11897 ft: 14520 corp: 22/485b lim: 40 exec/s: 55 rss: 71Mb L: 38/40 MS: 1 ChangeBit-
00:07:12.309 [2024-04-19 10:25:34.232877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0100 cdw11:60767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.309 [2024-04-19 10:25:34.232901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.309 [2024-04-19 10:25:34.232958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.309 [2024-04-19 10:25:34.232972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:12.309 [2024-04-19 10:25:34.233029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:76767676 cdw11:7676ff1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.309 [2024-04-19 10:25:34.233042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:12.309 [2024-04-19 10:25:34.233099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:76762176 cdw11:76768989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.309 [2024-04-19 10:25:34.233113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:12.309 #56 NEW cov: 11897 ft: 14550 corp: 23/523b lim: 40 exec/s: 56 rss: 71Mb L: 38/40 MS: 1 ChangeByte-
00:07:12.309 [2024-04-19 10:25:34.282758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.309 [2024-04-19 10:25:34.282784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.309 [2024-04-19 10:25:34.282844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.309 [2024-04-19 10:25:34.282859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:12.309 #61 NEW cov: 11897 ft: 14564 corp: 24/545b lim: 40 exec/s: 61 rss: 71Mb L: 22/40 MS: 5 ChangeBit-CopyPart-ShuffleBytes-PersAutoDict-InsertRepeatedBytes- DE: "\377\036"-
00:07:12.309 [2024-04-19 10:25:34.323178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0100 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.309 [2024-04-19 10:25:34.323203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.309 [2024-04-19 10:25:34.323278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.309 [2024-04-19 10:25:34.323291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:12.309 [2024-04-19 10:25:34.323351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:76767676 cdw11:7676ff1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.309 [2024-04-19 10:25:34.323365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:12.309 [2024-04-19 10:25:34.323422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.309 [2024-04-19 10:25:34.323435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:12.309 [2024-04-19 10:25:34.323494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:76767629 cdw11:0000f40a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.309 [2024-04-19 10:25:34.323508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:12.309 #62 NEW cov: 11897 ft: 14605 corp: 25/585b lim: 40 exec/s: 62 rss: 71Mb L: 40/40 MS: 1 ShuffleBytes-
00:07:12.309 [2024-04-19 10:25:34.372990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0000ff1b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.309 [2024-04-19 10:25:34.373014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.309 [2024-04-19 10:25:34.373090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.309 [2024-04-19 10:25:34.373104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:12.309 #63 NEW cov: 11897 ft: 14610 corp: 26/608b lim: 40 exec/s: 63 rss: 72Mb L: 23/40 MS: 1 InsertByte-
00:07:12.569 [2024-04-19 10:25:34.423001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff01c700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.569 [2024-04-19 10:25:34.423026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.569 #64 NEW cov: 11897 ft: 14693 corp: 27/617b lim: 40 exec/s: 64 rss: 72Mb L: 9/40 MS: 1 ChangeByte-
00:07:12.569 [2024-04-19 10:25:34.463465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0100 cdw11:60767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.569 [2024-04-19 10:25:34.463490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.569 [2024-04-19 10:25:34.463548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.569 [2024-04-19 10:25:34.463561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:12.569 [2024-04-19 10:25:34.463620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:56767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.569 [2024-04-19 10:25:34.463633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:12.569 [2024-04-19 10:25:34.463692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:76767676 cdw11:76768989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.569 [2024-04-19 10:25:34.463705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:12.569 #65 NEW cov: 11897 ft: 14704 corp: 28/655b lim: 40 exec/s: 65 rss: 72Mb L: 38/40 MS: 1 ChangeBit-
00:07:12.569 [2024-04-19 10:25:34.503324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0000ff1b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.569 [2024-04-19 10:25:34.503351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.569 [2024-04-19 10:25:34.503429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:1effffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.569 [2024-04-19 10:25:34.503443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:12.569 #66 NEW cov: 11897 ft: 14707 corp: 29/678b lim: 40 exec/s: 66 rss: 72Mb L: 23/40 MS: 1 PersAutoDict- DE: "\377\036"-
00:07:12.569 [2024-04-19 10:25:34.543710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a3be5e5 cdw11:e5e5e5e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.569 [2024-04-19 10:25:34.543734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.569 [2024-04-19 10:25:34.543793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.569 [2024-04-19 10:25:34.543806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:12.569 [2024-04-19 10:25:34.543884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ff010060 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.569 [2024-04-19 10:25:34.543898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:12.569 [2024-04-19 10:25:34.543966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:767676e5 cdw11:767676e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.569 [2024-04-19 10:25:34.543981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:12.569 #67 NEW cov: 11897 ft: 14732 corp: 30/710b lim: 40 exec/s: 67 rss: 72Mb L: 32/40 MS: 1 CrossOver-
00:07:12.569 [2024-04-19 10:25:34.594023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0100 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.569 [2024-04-19 10:25:34.594047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.569 [2024-04-19 10:25:34.594122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.570 [2024-04-19 10:25:34.594137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:12.570 [2024-04-19 10:25:34.594196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.570 [2024-04-19 10:25:34.594209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:12.570 [2024-04-19 10:25:34.594269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.570 [2024-04-19 10:25:34.594281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:12.570 [2024-04-19 10:25:34.594342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:76767629 cdw11:0000f40a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.570 [2024-04-19 10:25:34.594355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:12.570 #68 NEW cov: 11897 ft: 14737 corp: 31/750b lim: 40 exec/s: 68 rss: 72Mb L: 40/40 MS: 1 CopyPart-
00:07:12.570 [2024-04-19 10:25:34.633916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0100 cdw11:60767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.570 [2024-04-19 10:25:34.633940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.570 [2024-04-19 10:25:34.634015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.570 [2024-04-19 10:25:34.634030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:12.570 [2024-04-19 10:25:34.634087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:76767676 cdw11:76762f1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.570 [2024-04-19 10:25:34.634100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:12.570 [2024-04-19 10:25:34.634156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:76762176 cdw11:76768989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.570 [2024-04-19 10:25:34.634169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:12.570 #69 NEW cov: 11897 ft: 14741 corp: 32/788b lim: 40 exec/s: 69 rss: 72Mb L: 38/40 MS: 1 ChangeByte-
00:07:12.830 [2024-04-19 10:25:34.684224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0100 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.830 [2024-04-19 10:25:34.684249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.830 [2024-04-19 10:25:34.684308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.830 [2024-04-19 10:25:34.684322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:12.830 [2024-04-19 10:25:34.684380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:76767676 cdw11:7676ff1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.830 [2024-04-19 10:25:34.684393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:12.830 [2024-04-19 10:25:34.684450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:76763676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.830 [2024-04-19 10:25:34.684464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:12.830 [2024-04-19 10:25:34.684523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:76767676 cdw11:0000f40a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.830 [2024-04-19 10:25:34.684536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:12.830 #70 NEW cov: 11897 ft: 14747 corp: 33/828b lim: 40 exec/s: 70 rss: 72Mb L: 40/40 MS: 1 ChangeBit-
00:07:12.830 [2024-04-19 10:25:34.723825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ffff09df SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.830 [2024-04-19 10:25:34.723850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.830 #71 NEW cov: 11897 ft: 14751 corp: 34/837b lim: 40 exec/s: 71 rss: 72Mb L: 9/40 MS: 1 ChangeBit-
00:07:12.830 [2024-04-19 10:25:34.763893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ffff09df SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.830 [2024-04-19 10:25:34.763917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.830 #72 NEW cov: 11897 ft: 14811 corp: 35/846b lim: 40 exec/s: 72 rss: 72Mb L: 9/40 MS: 1 CrossOver-
00:07:12.830 [2024-04-19 10:25:34.804034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffff01 cdw11:0000f40a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.830 [2024-04-19 10:25:34.804058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.830 #73 NEW cov: 11897 ft: 14812 corp: 36/855b lim: 40 exec/s: 73 rss: 72Mb L: 9/40 MS: 1 InsertByte-
00:07:12.830 [2024-04-19 10:25:34.844622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff8100 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.830 [2024-04-19 10:25:34.844647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.830 [2024-04-19 10:25:34.844722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.830 [2024-04-19 10:25:34.844735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:12.830 [2024-04-19 10:25:34.844791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:76767676 cdw11:7676ff1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.830 [2024-04-19 10:25:34.844805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:12.830 [2024-04-19 10:25:34.844888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:76763676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.830 [2024-04-19 10:25:34.844901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:12.830 [2024-04-19 10:25:34.844959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:76767676 cdw11:0000f40a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.830 [2024-04-19 10:25:34.844974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:12.830 #74 NEW cov: 11897 ft: 14826 corp: 37/895b lim: 40 exec/s: 74 rss: 72Mb L: 40/40 MS: 1 ChangeBit-
00:07:12.830 [2024-04-19 10:25:34.894335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff00f7 cdw11:09010000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.830 [2024-04-19 10:25:34.894360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.830 #78 NEW cov: 11897 ft: 14902 corp: 38/910b lim: 40 exec/s: 78 rss: 72Mb L: 15/40 MS: 4 EraseBytes-ChangeBit-CopyPart-CMP- DE: "\001\000\000\000\000\000\000\000"-
00:07:12.830 [2024-04-19 10:25:34.934783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0100 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.830 [2024-04-19 10:25:34.934814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:12.830 [2024-04-19 10:25:34.934874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.830 [2024-04-19 10:25:34.934888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:12.830 [2024-04-19 10:25:34.934948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:76767676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.830 [2024-04-19 10:25:34.934961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:12.830 [2024-04-19 10:25:34.935019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:76757676 cdw11:76767676 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:12.830 [2024-04-19 10:25:34.935035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:13.089 #79 NEW cov: 11897 ft: 14924 corp: 39/948b lim: 40 exec/s: 79 rss: 72Mb L: 38/40 MS: 1 ChangeBinInt-
00:07:13.089 [2024-04-19 10:25:34.974481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffff16 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:13.089 [2024-04-19 10:25:34.974505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:13.089 #80 NEW cov: 11897 ft: 14935 corp: 40/959b lim: 40 exec/s: 80 rss: 72Mb L: 11/40 MS: 1 ChangeBit-
00:07:13.089 #80 DONE cov: 11897 ft: 14935 corp: 40/959b lim: 40 exec/s: 40 rss: 72Mb
00:07:13.089 ###### Recommended dictionary. ######
00:07:13.089 "\377\036" # Uses: 4
00:07:13.089 "\001\000\000\000\000\000\000\000" # Uses: 0
00:07:13.089 ###### End of recommended dictionary. ######
00:07:13.089 Done 80 runs in 2 second(s)
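The status lines above follow libFuzzer's usual format: "cov:" counts the code blocks or edges covered so far, "ft:" the coverage features, "corp: 40/959b" a 40-entry corpus totalling 959 bytes, "exec/s" the execution rate, "rss" the resident memory, "L: 11/40" the size of the new input against the 40-byte cap, and "MS:" the mutation sequence that produced it. One way to pull a run's coverage progression out of a saved copy of this console log (a hypothetical helper, not part of the SPDK test scripts; build.log is an assumed filename):

  # Print event number and covered blocks/edges for each corpus addition.
  grep -Eo '#[0-9]+ (NEW|REDUCE) cov: [0-9]+' build.log |
      awk '{ print $1, $4 }'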
00:07:13.089 10:25:35 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_13.conf /var/tmp/suppress_nvmf_fuzz
00:07:13.089 10:25:35 -- ../common.sh@72 -- # (( i++ ))
00:07:13.089 10:25:35 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:13.089 10:25:35 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1
00:07:13.089 10:25:35 -- nvmf/run.sh@23 -- # local fuzzer_type=14
00:07:13.089 10:25:35 -- nvmf/run.sh@24 -- # local timen=1
00:07:13.089 10:25:35 -- nvmf/run.sh@25 -- # local core=0x1
00:07:13.089 10:25:35 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14
00:07:13.089 10:25:35 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf
00:07:13.089 10:25:35 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:07:13.090 10:25:35 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:07:13.090 10:25:35 -- nvmf/run.sh@34 -- # printf %02d 14
00:07:13.090 10:25:35 -- nvmf/run.sh@34 -- # port=4414
00:07:13.090 10:25:35 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14
00:07:13.090 10:25:35 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414'
00:07:13.090 10:25:35 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:13.090 10:25:35 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:07:13.090 10:25:35 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:07:13.090 10:25:35 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14
00:07:13.090 [2024-04-19 10:25:35.167726] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization...
00:07:13.090 [2024-04-19 10:25:35.167798] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid205135 ]
00:07:13.349 EAL: No free 2048 kB hugepages reported on node 1
00:07:13.349 [2024-04-19 10:25:35.423328] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:13.608 [2024-04-19 10:25:35.505942] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:13.608 [2024-04-19 10:25:35.564858] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:13.608 [2024-04-19 10:25:35.580995] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 ***
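Between runs, the ../common.sh@72-73 frames in the trace show the driver advancing to the next fuzzer type: the counter is incremented, compared against fuzz_num, and start_llvm_fuzz is invoked again with the same time budget and core mask. The loop implied by those frames is sketched below; everything other than the start_llvm_fuzz call and its three arguments (fuzzer type, timen, core mask, per the run.sh@23-25 locals above) is inferred rather than visible in the log:

  # Assumed shape of the driver loop in ../common.sh.
  for ((i = 0; i < fuzz_num; i++)); do
      start_llvm_fuzz "$i" 1 0x1    # fuzzer type, time budget, core mask
  done

Run 14, which starts below on port 4414, therefore exercises the next handler in llvm_nvme_fuzz; the NEW_FUNC lines that follow identify it as fuzz_admin_set_features_command.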
00:07:13.608 INFO: Running with entropic power schedule (0xFF, 100).
00:07:13.608 INFO: Seed: 4001811460
00:07:13.608 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788),
00:07:13.608 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748),
00:07:13.608 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14
00:07:13.608 INFO: A corpus is not provided, starting from an empty corpus
00:07:13.608 #2 INITED exec/s: 0 rss: 63Mb
00:07:13.608 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? This may also happen if the target rejected all inputs we tried so far
00:07:13.867 NEW_FUNC[1/658]: 0x495dd0 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392
00:07:13.867 NEW_FUNC[2/658]: 0x4b7290 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340
00:07:13.867 #5 NEW cov: 11545 ft: 11546 corp: 2/8b lim: 35 exec/s: 0 rss: 69Mb L: 7/7 MS: 3 InsertByte-CMP-CopyPart- DE: "\001s"-
00:07:14.127 NEW_FUNC[1/2]: 0x4b0760 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273
00:07:14.127 NEW_FUNC[2/2]: 0x1169740 in nvmf_ctrlr_set_features_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1547
00:07:14.127 #6 NEW cov: 11732 ft: 12548 corp: 3/22b lim: 35 exec/s: 0 rss: 69Mb L: 14/14 MS: 1 CrossOver-
00:07:14.127 [2024-04-19 10:25:36.026801] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000fc SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.127 [2024-04-19 10:25:36.026874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:14.127 [2024-04-19 10:25:36.026910] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ac SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.127 [2024-04-19 10:25:36.026931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:14.127 [2024-04-19 10:25:36.026962] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ac SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.127 [2024-04-19 10:25:36.026979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:14.127 NEW_FUNC[1/15]: 0x16e0ec0 in spdk_nvme_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:263
00:07:14.127 NEW_FUNC[2/15]: 0x16e1100 in nvme_admin_qpair_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:202
00:07:14.127 #8 NEW cov: 11880 ft: 13159 corp: 4/45b lim: 35 exec/s: 0 rss: 69Mb L: 23/23 MS: 2 InsertByte-InsertRepeatedBytes-
00:07:14.127 [2024-04-19 10:25:36.086928] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ac SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.127 [2024-04-19 10:25:36.086963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:14.127 #9 NEW cov: 11965 ft: 13568 corp: 5/64b lim: 35 exec/s: 0 rss: 69Mb L: 19/23 MS: 1 CrossOver-
00:07:14.127 [2024-04-19 10:25:36.157067] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000fc SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.127 [2024-04-19 10:25:36.157099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:14.127 [2024-04-19 10:25:36.157134] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ac SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.127 [2024-04-19 10:25:36.157156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:14.127 #10 NEW cov: 11965 ft: 13847 corp: 6/81b lim: 35 exec/s: 0 rss: 70Mb L: 17/23 MS: 1 EraseBytes-
00:07:14.127 [2024-04-19 10:25:36.227411] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000fc SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.127 [2024-04-19 10:25:36.227443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:14.127 [2024-04-19 10:25:36.227479] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ac SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.127 [2024-04-19 10:25:36.227502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:14.127 [2024-04-19 10:25:36.227534] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ac SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.127 [2024-04-19 10:25:36.227551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:14.127 [2024-04-19 10:25:36.227582] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ac SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.127 [2024-04-19 10:25:36.227599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:14.386 #11 NEW cov: 11965 ft: 14189 corp: 7/114b lim: 35 exec/s: 0 rss: 70Mb L: 33/33 MS: 1 CopyPart-
00:07:14.386 #15 NEW cov: 11965 ft: 14271 corp: 8/123b lim: 35 exec/s: 0 rss: 70Mb L: 9/33 MS: 4 EraseBytes-ChangeByte-CopyPart-CopyPart-
00:07:14.386 #16 NEW cov: 11965 ft: 14400 corp: 9/135b lim: 35 exec/s: 0 rss: 70Mb L: 12/33 MS: 1 EraseBytes-
00:07:14.386 [2024-04-19 10:25:36.407783] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000fc SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.386 [2024-04-19 10:25:36.407826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:14.386 [2024-04-19 10:25:36.407878] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ac SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.386 [2024-04-19 10:25:36.407896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:14.386 [2024-04-19 10:25:36.407937] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ac SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.386 [2024-04-19 10:25:36.407954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:14.646 #17 NEW cov: 11965 ft: 14506 corp: 10/161b lim: 35 exec/s: 0 rss: 70Mb L: 26/33 MS: 1 EraseBytes-
00:07:14.646 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:07:14.646 #18 NEW cov: 11988 ft: 14578 corp: 11/173b lim: 35 exec/s: 0 rss: 70Mb L: 12/33 MS: 1 PersAutoDict- DE: "\001s"-
00:07:14.646 [2024-04-19 10:25:36.548124] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ARBITRATION cid:5 cdw10:80000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.646 [2024-04-19 10:25:36.548157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:14.646 [2024-04-19 10:25:36.548190] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.646 [2024-04-19 10:25:36.548214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:14.646 #19 NEW cov: 11988 ft: 14652 corp: 12/198b lim: 35 exec/s: 0 rss: 70Mb L: 25/33 MS: 1 InsertRepeatedBytes-
00:07:14.646 [2024-04-19 10:25:36.618298] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000fc SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.646 [2024-04-19 10:25:36.618330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:14.646 [2024-04-19 10:25:36.618364] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ac SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.646 [2024-04-19 10:25:36.618389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:14.646 [2024-04-19 10:25:36.618420] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ac SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.646 [2024-04-19 10:25:36.618440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:14.646 #20 NEW cov: 11988 ft: 14685 corp: 13/224b lim: 35 exec/s: 20 rss: 70Mb L: 26/33 MS: 1 ShuffleBytes-
00:07:14.646 [2024-04-19 10:25:36.688453] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000073 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.646 [2024-04-19 10:25:36.688483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:14.646 #21 NEW cov: 11988 ft: 14707 corp: 14/238b lim: 35 exec/s: 21 rss: 70Mb L: 14/33 MS: 1 PersAutoDict- DE: "\001s"-
00:07:14.905 #22 NEW cov: 11988 ft: 14722 corp: 15/250b lim: 35 exec/s: 22 rss: 70Mb L: 12/33 MS: 1 ShuffleBytes-
00:07:14.905 [2024-04-19 10:25:36.808743] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000fc SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.905 [2024-04-19 10:25:36.808773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:14.905 [2024-04-19 10:25:36.808830] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ac SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.905 [2024-04-19 10:25:36.808848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:14.905 [2024-04-19 10:25:36.808878] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ac SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.905 [2024-04-19 10:25:36.808896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:14.905 #23 NEW cov: 11988 ft: 14738 corp: 16/276b lim: 35 exec/s: 23 rss: 70Mb L: 26/33 MS: 1 PersAutoDict- DE: "\001s"-
00:07:14.905 [2024-04-19 10:25:36.858915] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ARBITRATION cid:5 cdw10:80000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.905 [2024-04-19 10:25:36.858946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:14.905 [2024-04-19 10:25:36.858979] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.905 [2024-04-19 10:25:36.858996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:14.905 #24 NEW cov: 11988 ft: 14764 corp: 17/301b lim: 35 exec/s: 24 rss: 71Mb L: 25/33 MS: 1 ChangeByte-
00:07:14.905 #25 NEW cov: 11988 ft: 14772 corp: 18/315b lim: 35 exec/s: 25 rss: 71Mb L: 14/33 MS: 1 ShuffleBytes-
00:07:14.905 [2024-04-19 10:25:36.979307] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES TIMESTAMP cid:5 cdw10:0000000e SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.905 [2024-04-19 10:25:36.979339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:14.905 [2024-04-19 10:25:36.979371] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ARBITRATION cid:6 cdw10:80000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.905 [2024-04-19 10:25:36.979387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:14.905 [2024-04-19 10:25:36.979418] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:14.905 [2024-04-19 10:25:36.979450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:15.164 #26 NEW cov: 11988 ft: 14786 corp: 19/347b lim: 35 exec/s: 26 rss: 71Mb L: 32/33 MS: 1 InsertRepeatedBytes-
00:07:15.164 [2024-04-19 10:25:37.039452] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000fc SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:15.164 [2024-04-19 10:25:37.039488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:15.165 [2024-04-19 10:25:37.039523] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ac SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:15.165 [2024-04-19 10:25:37.039547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:15.165 [2024-04-19 10:25:37.039579] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ac SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:15.165 [2024-04-19 10:25:37.039595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:15.165 [2024-04-19 10:25:37.039625] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ac SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:15.165 [2024-04-19 10:25:37.039642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:15.165 #27 NEW cov: 11988 ft: 14819 corp: 20/381b lim: 35 exec/s: 27 rss: 71Mb L: 34/34 MS: 1 CopyPart-
00:07:15.165 #28 NEW cov: 11988 ft: 14839 corp: 21/388b lim: 35 exec/s: 28 rss: 71Mb L: 7/34 MS: 1 ShuffleBytes-
00:07:15.165 [2024-04-19 10:25:37.149659] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000fc SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:15.165 [2024-04-19 10:25:37.149692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:15.165 [2024-04-19 10:25:37.149743] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ac SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:15.165 [2024-04-19 10:25:37.149760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:15.165 [2024-04-19 10:25:37.149792] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:000000ac SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:15.165 [2024-04-19 10:25:37.149818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:15.165 #29 NEW cov: 11988 ft: 14906 corp: 22/414b lim: 35 exec/s: 29 rss: 71Mb L: 26/34 MS: 1 CMP- DE: "\000\000"-
00:07:15.165 [2024-04-19 10:25:37.219921] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:15.165 [2024-04-19 10:25:37.219951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:15.165 [2024-04-19 10:25:37.219984] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:15.165 [2024-04-19 10:25:37.220000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:15.165 [2024-04-19 10:25:37.220031] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:15.165 [2024-04-19 10:25:37.220047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:15.165 #35 NEW cov: 11988 ft: 14965 corp: 23/446b lim: 35 exec/s: 35 rss: 71Mb L: 32/34 MS: 1 InsertRepeatedBytes-
00:07:15.424 [2024-04-19 10:25:37.289989] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000017 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:15.424 [2024-04-19 10:25:37.290022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:15.424 #37 NEW cov: 11988 ft:
14986 corp: 24/460b lim: 35 exec/s: 37 rss: 71Mb L: 14/34 MS: 2 EraseBytes-CMP- DE: "=\252t\240\027\371\031\000"- 00:07:15.424 #38 NEW cov: 11988 ft: 15003 corp: 25/469b lim: 35 exec/s: 38 rss: 71Mb L: 9/34 MS: 1 CopyPart- 00:07:15.424 [2024-04-19 10:25:37.400174] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES NON OPERATIONAL POWER STATE CONFIG cid:4 cdw10:00000011 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.424 [2024-04-19 10:25:37.400209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.424 #39 NEW cov: 11988 ft: 15037 corp: 26/478b lim: 35 exec/s: 39 rss: 71Mb L: 9/34 MS: 1 ShuffleBytes- 00:07:15.424 [2024-04-19 10:25:37.450477] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000073 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.424 [2024-04-19 10:25:37.450507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.424 #40 NEW cov: 11988 ft: 15046 corp: 27/499b lim: 35 exec/s: 40 rss: 71Mb L: 21/34 MS: 1 CrossOver- 00:07:15.424 [2024-04-19 10:25:37.520458] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000aa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.424 [2024-04-19 10:25:37.520487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.684 #41 NEW cov: 11988 ft: 15103 corp: 28/509b lim: 35 exec/s: 41 rss: 71Mb L: 10/34 MS: 1 EraseBytes- 00:07:15.684 [2024-04-19 10:25:37.590846] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000073 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.684 [2024-04-19 10:25:37.590876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.684 [2024-04-19 10:25:37.590925] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.684 [2024-04-19 10:25:37.590940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.684 [2024-04-19 10:25:37.590971] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.684 [2024-04-19 10:25:37.590986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.684 #42 NEW cov: 11988 ft: 15122 corp: 29/542b lim: 35 exec/s: 21 rss: 71Mb L: 33/34 MS: 1 InsertRepeatedBytes- 00:07:15.684 #42 DONE cov: 11988 ft: 15122 corp: 29/542b lim: 35 exec/s: 21 rss: 71Mb 00:07:15.684 ###### Recommended dictionary. ###### 00:07:15.684 "\001s" # Uses: 3 00:07:15.684 "\000\000" # Uses: 0 00:07:15.684 "=\252t\240\027\371\031\000" # Uses: 0 00:07:15.684 ###### End of recommended dictionary. 
###### 00:07:15.684 Done 42 runs in 2 second(s) 00:07:15.684 10:25:37 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_14.conf /var/tmp/suppress_nvmf_fuzz 00:07:15.684 10:25:37 -- ../common.sh@72 -- # (( i++ )) 00:07:15.684 10:25:37 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:15.684 10:25:37 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:07:15.684 10:25:37 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:07:15.684 10:25:37 -- nvmf/run.sh@24 -- # local timen=1 00:07:15.684 10:25:37 -- nvmf/run.sh@25 -- # local core=0x1 00:07:15.684 10:25:37 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:15.684 10:25:37 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:07:15.684 10:25:37 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:15.684 10:25:37 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:15.684 10:25:37 -- nvmf/run.sh@34 -- # printf %02d 15 00:07:15.684 10:25:37 -- nvmf/run.sh@34 -- # port=4415 00:07:15.684 10:25:37 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:15.684 10:25:37 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:07:15.684 10:25:37 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:15.684 10:25:37 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:15.684 10:25:37 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:15.684 10:25:37 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 00:07:15.943 [2024-04-19 10:25:37.799130] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:07:15.943 [2024-04-19 10:25:37.799202] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid205455 ] 00:07:15.943 EAL: No free 2048 kB hugepages reported on node 1 00:07:16.202 [2024-04-19 10:25:38.057325] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.202 [2024-04-19 10:25:38.140500] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.202 [2024-04-19 10:25:38.199461] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:16.202 [2024-04-19 10:25:38.215595] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:07:16.202 INFO: Running with entropic power schedule (0xFF, 100). 
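The set -x trace above shows how nvmf/run.sh prepares each fuzzer instance before handing off to libFuzzer. Below is a minimal bash sketch reconstructing those steps; only commands actually echoed in the trace are reproduced, while the redirection targets and the rootdir variable are assumptions (set -x does not echo redirections):

#!/usr/bin/env bash
# Sketch of start_llvm_fuzz as traced above. rootdir and the output
# redirections are inferred, not shown in the trace.
rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk

start_llvm_fuzz() {
    local fuzzer_type=$1   # e.g. 15
    local timen=$2         # run time, e.g. 1
    local core=$3          # core mask, e.g. 0x1
    local corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
    local nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
    local suppress_file=/var/tmp/suppress_nvmf_fuzz
    local LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0

    # Each instance listens on its own TCP port: "44" + zero-padded type (15 -> 4415).
    local port="44$(printf %02d $fuzzer_type)"
    mkdir -p $corpus_dir

    local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

    # Rewrite the template JSON config so the target listens on this instance's port.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        $rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf > $nvmf_cfg

    # Suppress known shutdown-path allocations so LeakSanitizer stays quiet.
    echo leak:spdk_nvmf_qpair_disconnect > $suppress_file
    echo leak:nvmf_ctrlr_create >> $suppress_file

    LSAN_OPTIONS=$LSAN_OPTIONS \
    $rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m $core -s 512 \
        -P $rootdir/../output/llvm/ -F "$trid" -c $nvmf_cfg \
        -t $timen -D $corpus_dir -Z $fuzzer_type
}

Deriving the port from the fuzzer number is what lets the instances run back to back without colliding: run 15 listens on trsvcid 4415 here, and the run 16 setup later in this log switches to 4416.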
00:07:16.202 INFO: Seed: 2339864745 00:07:16.202 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:07:16.202 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:07:16.202 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:16.202 INFO: A corpus is not provided, starting from an empty corpus 00:07:16.202 #2 INITED exec/s: 0 rss: 63Mb 00:07:16.202 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:16.202 This may also happen if the target rejected all inputs we tried so far 00:07:16.203 [2024-04-19 10:25:38.261307] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.203 [2024-04-19 10:25:38.261338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.203 [2024-04-19 10:25:38.261397] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.203 [2024-04-19 10:25:38.261412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.203 [2024-04-19 10:25:38.261467] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.203 [2024-04-19 10:25:38.261481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.203 [2024-04-19 10:25:38.261537] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.203 [2024-04-19 10:25:38.261549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.770 NEW_FUNC[1/670]: 0x497310 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:07:16.770 NEW_FUNC[2/670]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:16.770 #11 NEW cov: 11635 ft: 11636 corp: 2/30b lim: 35 exec/s: 0 rss: 69Mb L: 29/29 MS: 4 CopyPart-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:07:16.770 [2024-04-19 10:25:38.592032] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.770 [2024-04-19 10:25:38.592068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.770 [2024-04-19 10:25:38.592129] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.770 [2024-04-19 10:25:38.592144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.770 [2024-04-19 10:25:38.592206] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.770 [2024-04-19 10:25:38.592219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.770 
#12 NEW cov: 11765 ft: 12606 corp: 3/54b lim: 35 exec/s: 0 rss: 69Mb L: 24/29 MS: 1 CrossOver- 00:07:16.770 [2024-04-19 10:25:38.642216] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.770 [2024-04-19 10:25:38.642243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.770 [2024-04-19 10:25:38.642301] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.770 [2024-04-19 10:25:38.642315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.770 [2024-04-19 10:25:38.642370] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.770 [2024-04-19 10:25:38.642384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.770 [2024-04-19 10:25:38.642443] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.770 [2024-04-19 10:25:38.642456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.770 #13 NEW cov: 11771 ft: 12830 corp: 4/83b lim: 35 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 CopyPart- 00:07:16.770 NEW_FUNC[1/1]: 0x4b7290 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:07:16.770 #18 NEW cov: 11870 ft: 13467 corp: 5/92b lim: 35 exec/s: 0 rss: 69Mb L: 9/29 MS: 5 ShuffleBytes-CopyPart-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:07:16.770 [2024-04-19 10:25:38.722392] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.770 [2024-04-19 10:25:38.722417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.770 [2024-04-19 10:25:38.722477] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.770 [2024-04-19 10:25:38.722491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.771 [2024-04-19 10:25:38.722549] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.771 [2024-04-19 10:25:38.722561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.771 [2024-04-19 10:25:38.722619] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.771 [2024-04-19 10:25:38.722633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.771 #19 NEW cov: 11870 ft: 13570 corp: 6/121b lim: 35 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 CopyPart- 00:07:16.771 [2024-04-19 10:25:38.762601] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 
cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.771 [2024-04-19 10:25:38.762628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.771 [2024-04-19 10:25:38.762689] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.771 [2024-04-19 10:25:38.762702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.771 [2024-04-19 10:25:38.762764] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.771 [2024-04-19 10:25:38.762778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.771 [2024-04-19 10:25:38.762838] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.771 [2024-04-19 10:25:38.762853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.771 [2024-04-19 10:25:38.762913] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.771 [2024-04-19 10:25:38.762927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:16.771 #20 NEW cov: 11870 ft: 13710 corp: 7/156b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:07:16.771 [2024-04-19 10:25:38.802617] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.771 [2024-04-19 10:25:38.802642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.771 [2024-04-19 10:25:38.802703] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.771 [2024-04-19 10:25:38.802717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.771 [2024-04-19 10:25:38.802775] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.771 [2024-04-19 10:25:38.802788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.771 [2024-04-19 10:25:38.802850] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.771 [2024-04-19 10:25:38.802864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.771 #21 NEW cov: 11870 ft: 13760 corp: 8/185b lim: 35 exec/s: 0 rss: 69Mb L: 29/35 MS: 1 CopyPart- 00:07:16.771 [2024-04-19 10:25:38.842322] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.771 [2024-04-19 10:25:38.842346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:16.771 #26 NEW cov: 11870 ft: 13932 corp: 9/197b lim: 35 exec/s: 0 rss: 70Mb L: 12/35 MS: 5 CrossOver-ChangeBit-InsertByte-InsertByte-CMP- DE: "\001\031\371\035\201$0Z"- 00:07:17.030 [2024-04-19 10:25:38.882892] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.030 [2024-04-19 10:25:38.882916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.030 [2024-04-19 10:25:38.882975] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.030 [2024-04-19 10:25:38.882989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.030 [2024-04-19 10:25:38.883048] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.030 [2024-04-19 10:25:38.883061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.030 [2024-04-19 10:25:38.883121] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.030 [2024-04-19 10:25:38.883139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.030 #27 NEW cov: 11870 ft: 13991 corp: 10/227b lim: 35 exec/s: 0 rss: 70Mb L: 30/35 MS: 1 InsertByte- 00:07:17.030 NEW_FUNC[1/1]: 0x4b0760 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:07:17.030 #28 NEW cov: 11908 ft: 14101 corp: 11/236b lim: 35 exec/s: 0 rss: 70Mb L: 9/35 MS: 1 PersAutoDict- DE: "\001\031\371\035\201$0Z"- 00:07:17.030 [2024-04-19 10:25:38.963130] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.030 [2024-04-19 10:25:38.963156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.030 [2024-04-19 10:25:38.963217] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.030 [2024-04-19 10:25:38.963232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.030 [2024-04-19 10:25:38.963289] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.030 [2024-04-19 10:25:38.963303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.030 [2024-04-19 10:25:38.963361] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.030 [2024-04-19 10:25:38.963376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.030 #29 NEW cov: 11908 ft: 14203 corp: 12/266b lim: 35 exec/s: 0 rss: 70Mb L: 30/35 MS: 1 ChangeByte- 00:07:17.030 [2024-04-19 10:25:39.003348] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.030 [2024-04-19 10:25:39.003372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.030 [2024-04-19 10:25:39.003434] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.030 [2024-04-19 10:25:39.003448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.030 [2024-04-19 10:25:39.003504] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.030 [2024-04-19 10:25:39.003517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.030 [2024-04-19 10:25:39.003576] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.030 [2024-04-19 10:25:39.003591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.030 [2024-04-19 10:25:39.003650] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.030 [2024-04-19 10:25:39.003664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:17.030 #30 NEW cov: 11908 ft: 14271 corp: 13/301b lim: 35 exec/s: 0 rss: 70Mb L: 35/35 MS: 1 ChangeBinInt- 00:07:17.030 [2024-04-19 10:25:39.043279] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.030 [2024-04-19 10:25:39.043303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.030 [2024-04-19 10:25:39.043363] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.030 [2024-04-19 10:25:39.043380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.030 [2024-04-19 10:25:39.043438] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.030 [2024-04-19 10:25:39.043453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.030 [2024-04-19 10:25:39.043513] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.030 [2024-04-19 10:25:39.043527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.030 #31 NEW cov: 11908 ft: 14338 corp: 14/331b lim: 35 exec/s: 0 rss: 70Mb L: 30/35 MS: 1 InsertByte- 00:07:17.030 [2024-04-19 10:25:39.093303] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.031 [2024-04-19 10:25:39.093327] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.031 [2024-04-19 10:25:39.093387] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.031 [2024-04-19 10:25:39.093401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.031 [2024-04-19 10:25:39.093476] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.031 [2024-04-19 10:25:39.093490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.031 #32 NEW cov: 11908 ft: 14387 corp: 15/352b lim: 35 exec/s: 0 rss: 70Mb L: 21/35 MS: 1 EraseBytes- 00:07:17.290 [2024-04-19 10:25:39.143729] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000023 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.290 [2024-04-19 10:25:39.143755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.290 [2024-04-19 10:25:39.143822] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.290 [2024-04-19 10:25:39.143837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.290 [2024-04-19 10:25:39.143897] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.290 [2024-04-19 10:25:39.143911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.290 [2024-04-19 10:25:39.143970] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.290 [2024-04-19 10:25:39.143985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.290 [2024-04-19 10:25:39.144043] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.290 [2024-04-19 10:25:39.144058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:17.290 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:17.290 #33 NEW cov: 11931 ft: 14458 corp: 16/387b lim: 35 exec/s: 0 rss: 70Mb L: 35/35 MS: 1 ChangeBinInt- 00:07:17.290 [2024-04-19 10:25:39.193524] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.193549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.291 [2024-04-19 10:25:39.193615] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.193630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.291 [2024-04-19 10:25:39.193686] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.193699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.291 #34 NEW cov: 11931 ft: 14472 corp: 17/408b lim: 35 exec/s: 0 rss: 70Mb L: 21/35 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\030"- 00:07:17.291 [2024-04-19 10:25:39.233648] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.233672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.291 [2024-04-19 10:25:39.233734] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.233748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.291 [2024-04-19 10:25:39.233806] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.233824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.291 #40 NEW cov: 11931 ft: 14516 corp: 18/434b lim: 35 exec/s: 40 rss: 70Mb L: 26/35 MS: 1 CrossOver- 00:07:17.291 [2024-04-19 10:25:39.273864] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.273889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.291 [2024-04-19 10:25:39.273968] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.273982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.291 [2024-04-19 10:25:39.274041] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.274055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.291 [2024-04-19 10:25:39.274117] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.274130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.291 #41 NEW cov: 11931 ft: 14565 corp: 19/463b lim: 35 exec/s: 41 rss: 70Mb L: 29/35 MS: 1 ChangeBinInt- 00:07:17.291 [2024-04-19 10:25:39.314003] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.314027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.291 [2024-04-19 
10:25:39.314103] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.314117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.291 [2024-04-19 10:25:39.314175] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.314191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.291 [2024-04-19 10:25:39.314253] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.314267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.291 #42 NEW cov: 11931 ft: 14643 corp: 20/497b lim: 35 exec/s: 42 rss: 70Mb L: 34/35 MS: 1 CrossOver- 00:07:17.291 [2024-04-19 10:25:39.354127] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.354151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.291 [2024-04-19 10:25:39.354212] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000136 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.354226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.291 [2024-04-19 10:25:39.354301] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.354314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.291 [2024-04-19 10:25:39.354373] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.354387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.291 #43 NEW cov: 11931 ft: 14666 corp: 21/531b lim: 35 exec/s: 43 rss: 70Mb L: 34/35 MS: 1 InsertRepeatedBytes- 00:07:17.291 [2024-04-19 10:25:39.394100] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.394124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.291 [2024-04-19 10:25:39.394203] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 10:25:39.394217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.291 [2024-04-19 10:25:39.394276] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.291 [2024-04-19 
10:25:39.394290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.550 #44 NEW cov: 11931 ft: 14679 corp: 22/557b lim: 35 exec/s: 44 rss: 70Mb L: 26/35 MS: 1 ShuffleBytes- 00:07:17.550 #45 NEW cov: 11931 ft: 14697 corp: 23/567b lim: 35 exec/s: 45 rss: 71Mb L: 10/35 MS: 1 InsertByte- 00:07:17.550 [2024-04-19 10:25:39.474604] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000023 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.550 [2024-04-19 10:25:39.474629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.550 [2024-04-19 10:25:39.474707] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.550 [2024-04-19 10:25:39.474721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.550 [2024-04-19 10:25:39.474780] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.550 [2024-04-19 10:25:39.474794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.550 [2024-04-19 10:25:39.474862] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.550 [2024-04-19 10:25:39.474876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.550 [2024-04-19 10:25:39.474934] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.550 [2024-04-19 10:25:39.474948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:17.550 #46 NEW cov: 11931 ft: 14707 corp: 24/602b lim: 35 exec/s: 46 rss: 71Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:17.550 [2024-04-19 10:25:39.514195] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.550 [2024-04-19 10:25:39.514219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.550 #47 NEW cov: 11931 ft: 14719 corp: 25/614b lim: 35 exec/s: 47 rss: 71Mb L: 12/35 MS: 1 ChangeBinInt- 00:07:17.550 [2024-04-19 10:25:39.554812] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.550 [2024-04-19 10:25:39.554837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.551 [2024-04-19 10:25:39.554916] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.551 [2024-04-19 10:25:39.554930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.551 [2024-04-19 10:25:39.554991] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:17.551 [2024-04-19 10:25:39.555004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.551 [2024-04-19 10:25:39.555068] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.551 [2024-04-19 10:25:39.555082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.551 [2024-04-19 10:25:39.555142] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.551 [2024-04-19 10:25:39.555156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:17.551 #48 NEW cov: 11931 ft: 14742 corp: 26/649b lim: 35 exec/s: 48 rss: 71Mb L: 35/35 MS: 1 ChangeByte- 00:07:17.551 [2024-04-19 10:25:39.594785] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.551 [2024-04-19 10:25:39.594815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.551 [2024-04-19 10:25:39.594890] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.551 [2024-04-19 10:25:39.594905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.551 [2024-04-19 10:25:39.594966] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.551 [2024-04-19 10:25:39.594979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.551 [2024-04-19 10:25:39.595040] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.551 [2024-04-19 10:25:39.595054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.551 #49 NEW cov: 11931 ft: 14748 corp: 27/680b lim: 35 exec/s: 49 rss: 71Mb L: 31/35 MS: 1 InsertByte- 00:07:17.551 [2024-04-19 10:25:39.634661] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000023 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.551 [2024-04-19 10:25:39.634686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.551 [2024-04-19 10:25:39.634748] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.551 [2024-04-19 10:25:39.634777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.551 #50 NEW cov: 11931 ft: 14933 corp: 28/697b lim: 35 exec/s: 50 rss: 71Mb L: 17/35 MS: 1 CrossOver- 00:07:17.811 [2024-04-19 10:25:39.674667] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.811 [2024-04-19 10:25:39.674691] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.811 #51 NEW cov: 11931 ft: 15006 corp: 29/709b lim: 35 exec/s: 51 rss: 71Mb L: 12/35 MS: 1 CMP- DE: "\377~"- 00:07:17.811 [2024-04-19 10:25:39.725075] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.811 [2024-04-19 10:25:39.725099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.811 [2024-04-19 10:25:39.725176] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000009c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.811 [2024-04-19 10:25:39.725190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.811 [2024-04-19 10:25:39.725249] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.811 [2024-04-19 10:25:39.725263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.811 #52 NEW cov: 11931 ft: 15009 corp: 30/735b lim: 35 exec/s: 52 rss: 71Mb L: 26/35 MS: 1 InsertRepeatedBytes- 00:07:17.811 #53 NEW cov: 11931 ft: 15070 corp: 31/744b lim: 35 exec/s: 53 rss: 72Mb L: 9/35 MS: 1 ChangeASCIIInt- 00:07:17.811 [2024-04-19 10:25:39.805555] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.811 [2024-04-19 10:25:39.805581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.811 [2024-04-19 10:25:39.805659] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.811 [2024-04-19 10:25:39.805673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.811 [2024-04-19 10:25:39.805735] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.811 [2024-04-19 10:25:39.805748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.811 [2024-04-19 10:25:39.805814] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.811 [2024-04-19 10:25:39.805828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.811 [2024-04-19 10:25:39.805901] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.811 [2024-04-19 10:25:39.805914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:17.811 #54 NEW cov: 11931 ft: 15074 corp: 32/779b lim: 35 exec/s: 54 rss: 72Mb L: 35/35 MS: 1 CrossOver- 00:07:17.811 [2024-04-19 10:25:39.845236] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.811 [2024-04-19 10:25:39.845261] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.811 [2024-04-19 10:25:39.845320] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.811 [2024-04-19 10:25:39.845334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.811 #55 NEW cov: 11931 ft: 15102 corp: 33/797b lim: 35 exec/s: 55 rss: 72Mb L: 18/35 MS: 1 EraseBytes- 00:07:17.811 [2024-04-19 10:25:39.885483] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.811 [2024-04-19 10:25:39.885507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.811 [2024-04-19 10:25:39.885583] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.811 [2024-04-19 10:25:39.885597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.811 [2024-04-19 10:25:39.885655] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.811 [2024-04-19 10:25:39.885670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.811 #56 NEW cov: 11931 ft: 15106 corp: 34/818b lim: 35 exec/s: 56 rss: 72Mb L: 21/35 MS: 1 ChangeBit- 00:07:18.069 [2024-04-19 10:25:39.925933] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.069 [2024-04-19 10:25:39.925957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.069 [2024-04-19 10:25:39.926018] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.069 [2024-04-19 10:25:39.926033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.069 [2024-04-19 10:25:39.926092] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.069 [2024-04-19 10:25:39.926106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.069 [2024-04-19 10:25:39.926165] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.069 [2024-04-19 10:25:39.926179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.069 [2024-04-19 10:25:39.926236] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.069 [2024-04-19 10:25:39.926250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:18.069 #57 NEW cov: 11931 ft: 15120 corp: 35/853b lim: 35 exec/s: 57 rss: 72Mb L: 
35/35 MS: 1 PersAutoDict- DE: "\377~"- 00:07:18.069 #58 NEW cov: 11931 ft: 15125 corp: 36/862b lim: 35 exec/s: 58 rss: 72Mb L: 9/35 MS: 1 CrossOver- 00:07:18.069 [2024-04-19 10:25:40.005868] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.069 [2024-04-19 10:25:40.005894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.069 [2024-04-19 10:25:40.005956] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.069 [2024-04-19 10:25:40.005970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.069 [2024-04-19 10:25:40.006028] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.069 [2024-04-19 10:25:40.006042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.069 #59 NEW cov: 11931 ft: 15126 corp: 37/885b lim: 35 exec/s: 59 rss: 72Mb L: 23/35 MS: 1 EraseBytes- 00:07:18.069 [2024-04-19 10:25:40.046344] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.069 [2024-04-19 10:25:40.046381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.069 [2024-04-19 10:25:40.046526] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.069 [2024-04-19 10:25:40.046542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.069 [2024-04-19 10:25:40.046601] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.069 [2024-04-19 10:25:40.046616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.069 [2024-04-19 10:25:40.046674] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.069 [2024-04-19 10:25:40.046688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:18.069 #60 NEW cov: 11931 ft: 15162 corp: 38/920b lim: 35 exec/s: 60 rss: 72Mb L: 35/35 MS: 1 CrossOver- 00:07:18.069 [2024-04-19 10:25:40.096358] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.069 [2024-04-19 10:25:40.096387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.069 [2024-04-19 10:25:40.096449] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.069 [2024-04-19 10:25:40.096463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.070 [2024-04-19 10:25:40.096519] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.070 [2024-04-19 10:25:40.096533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.070 [2024-04-19 10:25:40.096589] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.070 [2024-04-19 10:25:40.096603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.070 [2024-04-19 10:25:40.096660] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.070 [2024-04-19 10:25:40.096674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:18.070 #61 NEW cov: 11931 ft: 15166 corp: 39/955b lim: 35 exec/s: 61 rss: 72Mb L: 35/35 MS: 1 ChangeBit- 00:07:18.070 [2024-04-19 10:25:40.136382] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.070 [2024-04-19 10:25:40.136408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.070 [2024-04-19 10:25:40.136472] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.070 [2024-04-19 10:25:40.136486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.070 [2024-04-19 10:25:40.136542] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.070 [2024-04-19 10:25:40.136555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.070 [2024-04-19 10:25:40.136616] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.070 [2024-04-19 10:25:40.136630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.070 #62 NEW cov: 11931 ft: 15199 corp: 40/985b lim: 35 exec/s: 62 rss: 72Mb L: 30/35 MS: 1 ChangeBit- 00:07:18.070 [2024-04-19 10:25:40.176176] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000023 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.070 [2024-04-19 10:25:40.176201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.070 [2024-04-19 10:25:40.176260] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000045 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.070 [2024-04-19 10:25:40.176275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.328 #63 NEW cov: 11931 ft: 15218 corp: 41/1002b lim: 35 exec/s: 63 rss: 72Mb L: 17/35 MS: 1 ShuffleBytes- 00:07:18.328 [2024-04-19 10:25:40.226308] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000045 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:18.328 [2024-04-19 10:25:40.226332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:18.328 [2024-04-19 10:25:40.226392] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:18.328 [2024-04-19 10:25:40.226406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:18.328 #64 pulse cov: 11931 ft: 15231 corp: 41/1002b lim: 35 exec/s: 32 rss: 72Mb
00:07:18.328 #64 NEW cov: 11931 ft: 15231 corp: 42/1017b lim: 35 exec/s: 32 rss: 72Mb L: 15/35 MS: 1 EraseBytes-
00:07:18.328 #64 DONE cov: 11931 ft: 15231 corp: 42/1017b lim: 35 exec/s: 32 rss: 72Mb
00:07:18.329 ###### Recommended dictionary. ######
00:07:18.329 "\001\031\371\035\201$0Z" # Uses: 1
00:07:18.329 "\000\000\000\000\000\000\000\030" # Uses: 0
00:07:18.329 "\377~" # Uses: 1
00:07:18.329 ###### End of recommended dictionary. ######
00:07:18.329 Done 64 runs in 2 second(s)
00:07:18.329 10:25:40 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_15.conf /var/tmp/suppress_nvmf_fuzz
00:07:18.329 10:25:40 -- ../common.sh@72 -- # (( i++ ))
00:07:18.329 10:25:40 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:18.329 10:25:40 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1
00:07:18.329 10:25:40 -- nvmf/run.sh@23 -- # local fuzzer_type=16
00:07:18.329 10:25:40 -- nvmf/run.sh@24 -- # local timen=1
00:07:18.329 10:25:40 -- nvmf/run.sh@25 -- # local core=0x1
00:07:18.329 10:25:40 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:07:18.329 10:25:40 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf
00:07:18.329 10:25:40 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:07:18.329 10:25:40 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:07:18.329 10:25:40 -- nvmf/run.sh@34 -- # printf %02d 16
00:07:18.329 10:25:40 -- nvmf/run.sh@34 -- # port=4416
00:07:18.329 10:25:40 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:07:18.329 10:25:40 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416'
00:07:18.329 10:25:40 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:18.329 10:25:40 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:07:18.329 10:25:40 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:07:18.329 10:25:40 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16
00:07:18.329 [2024-04-19 10:25:40.427800] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization...
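The start_llvm_fuzz trace above is the complete per-target setup: fuzzer number 16 is mapped to TCP port 4416, given its own corpus directory, JSON config, and leak-suppression file, and llvm_nvme_fuzz is then launched against the resulting transport ID. The following is a minimal Bash sketch reconstructed only from the traced commands, not the actual nvmf/run.sh source; the output redirections are guesses, the export of LSAN_OPTIONS is assumed, and WORKSPACE is a shorthand introduced here.

    # Sketch of the traced setup; $1 = fuzzer number, $2 = run time (s), $3 = core mask.
    WORKSPACE=/var/jenkins/workspace/short-fuzz-phy-autotest
    start_llvm_fuzz() {
        local fuzzer_type=$1 timen=$2 core=$3
        local corpus_dir=$WORKSPACE/spdk/../corpus/llvm_nvmf_$fuzzer_type
        local nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
        local suppress_file=/var/tmp/suppress_nvmf_fuzz
        # The trace shows only the assignment; exporting is assumed here so the
        # fuzzer process actually inherits the LeakSanitizer options.
        local LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0
        export LSAN_OPTIONS
        # Port is "44" plus the zero-padded fuzzer number: 16 -> 4416, 17 -> 4417.
        local port="44$(printf %02d "$fuzzer_type")"
        mkdir -p "$corpus_dir"
        local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
        # Point the stock JSON config, whose default listener port is 4420, at this run's port.
        sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
            "$WORKSPACE/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
        # Intentional allocations that LeakSanitizer should not report as leaks.
        echo leak:spdk_nvmf_qpair_disconnect > "$suppress_file"
        echo leak:nvmf_ctrlr_create >> "$suppress_file"
        "$WORKSPACE/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
            -m "$core" -s 512 -P "$WORKSPACE/spdk/../output/llvm/" \
            -F "$trid" -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"
    }
    start_llvm_fuzz 16 1 0x1   # the invocation traced above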
00:07:18.329 [2024-04-19 10:25:40.427879] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid205782 ] 00:07:18.587 EAL: No free 2048 kB hugepages reported on node 1 00:07:18.587 [2024-04-19 10:25:40.690244] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:18.846 [2024-04-19 10:25:40.776474] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.846 [2024-04-19 10:25:40.836151] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:18.846 [2024-04-19 10:25:40.852285] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:07:18.846 INFO: Running with entropic power schedule (0xFF, 100). 00:07:18.846 INFO: Seed: 681067061 00:07:18.846 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:07:18.846 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:07:18.846 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:18.846 INFO: A corpus is not provided, starting from an empty corpus 00:07:18.846 #2 INITED exec/s: 0 rss: 63Mb 00:07:18.846 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:18.846 This may also happen if the target rejected all inputs we tried so far 00:07:18.846 [2024-04-19 10:25:40.901119] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.846 [2024-04-19 10:25:40.901149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.105 NEW_FUNC[1/671]: 0x4987c0 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:07:19.106 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:19.365 #3 NEW cov: 11739 ft: 11740 corp: 2/31b lim: 105 exec/s: 0 rss: 69Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:19.365 [2024-04-19 10:25:41.231954] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.365 [2024-04-19 10:25:41.232004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.365 #10 NEW cov: 11869 ft: 12280 corp: 3/62b lim: 105 exec/s: 0 rss: 69Mb L: 31/31 MS: 2 CrossOver-CrossOver- 00:07:19.365 [2024-04-19 10:25:41.271889] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.365 [2024-04-19 10:25:41.271917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.365 #11 NEW cov: 11875 ft: 12472 corp: 4/93b lim: 105 exec/s: 0 rss: 69Mb L: 31/31 MS: 1 ShuffleBytes- 00:07:19.365 [2024-04-19 10:25:41.312018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.365 [2024-04-19 10:25:41.312044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:07:19.365 #12 NEW cov: 11960 ft: 12807 corp: 5/124b lim: 105 exec/s: 0 rss: 69Mb L: 31/31 MS: 1 ChangeBinInt- 00:07:19.365 [2024-04-19 10:25:41.352237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.365 [2024-04-19 10:25:41.352262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.365 [2024-04-19 10:25:41.352302] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.365 [2024-04-19 10:25:41.352318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.365 #13 NEW cov: 11960 ft: 13357 corp: 6/182b lim: 105 exec/s: 0 rss: 69Mb L: 58/58 MS: 1 InsertRepeatedBytes- 00:07:19.365 [2024-04-19 10:25:41.392195] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:256 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.365 [2024-04-19 10:25:41.392222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.365 #14 NEW cov: 11960 ft: 13472 corp: 7/215b lim: 105 exec/s: 0 rss: 69Mb L: 33/58 MS: 1 CMP- DE: "\001\000"- 00:07:19.365 [2024-04-19 10:25:41.432477] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.365 [2024-04-19 10:25:41.432504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.365 [2024-04-19 10:25:41.432545] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.365 [2024-04-19 10:25:41.432560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.365 #15 NEW cov: 11960 ft: 13554 corp: 8/273b lim: 105 exec/s: 0 rss: 70Mb L: 58/58 MS: 1 ChangeByte- 00:07:19.633 [2024-04-19 10:25:41.482494] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.633 [2024-04-19 10:25:41.482520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.633 #16 NEW cov: 11960 ft: 13593 corp: 9/304b lim: 105 exec/s: 0 rss: 70Mb L: 31/58 MS: 1 ShuffleBytes- 00:07:19.633 [2024-04-19 10:25:41.522600] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.633 [2024-04-19 10:25:41.522625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.633 #17 NEW cov: 11960 ft: 13630 corp: 10/334b lim: 105 exec/s: 0 rss: 70Mb L: 30/58 MS: 1 ChangeBinInt- 00:07:19.633 [2024-04-19 10:25:41.562815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:154 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.633 [2024-04-19 10:25:41.562841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.633 [2024-04-19 10:25:41.562882] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11067877561816684953 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.633 [2024-04-19 10:25:41.562898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.633 #18 NEW cov: 11960 ft: 13668 corp: 11/380b lim: 105 exec/s: 0 rss: 70Mb L: 46/58 MS: 1 InsertRepeatedBytes- 00:07:19.633 [2024-04-19 10:25:41.602827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.633 [2024-04-19 10:25:41.602854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.633 #19 NEW cov: 11960 ft: 13725 corp: 12/414b lim: 105 exec/s: 0 rss: 70Mb L: 34/58 MS: 1 CrossOver- 00:07:19.633 [2024-04-19 10:25:41.643054] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:154 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.633 [2024-04-19 10:25:41.643081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.633 [2024-04-19 10:25:41.643129] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11067877561816684953 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.633 [2024-04-19 10:25:41.643146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.633 #20 NEW cov: 11960 ft: 13774 corp: 13/460b lim: 105 exec/s: 0 rss: 70Mb L: 46/58 MS: 1 ChangeBit- 00:07:19.633 [2024-04-19 10:25:41.683043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:11 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.633 [2024-04-19 10:25:41.683069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.633 #21 NEW cov: 11960 ft: 13805 corp: 14/485b lim: 105 exec/s: 0 rss: 70Mb L: 25/58 MS: 1 EraseBytes- 00:07:19.633 [2024-04-19 10:25:41.723357] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:154 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.633 [2024-04-19 10:25:41.723382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.633 [2024-04-19 10:25:41.723430] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:8391460345569637492 len:29813 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.633 [2024-04-19 10:25:41.723446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.633 [2024-04-19 10:25:41.723498] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:11068045787095734681 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.633 [2024-04-19 10:25:41.723513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:19.892 #27 NEW cov: 11960 ft: 14144 corp: 15/564b lim: 105 exec/s: 0 rss: 70Mb L: 79/79 MS: 1 
CrossOver- 00:07:19.892 [2024-04-19 10:25:41.763261] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.892 [2024-04-19 10:25:41.763288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.892 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:19.892 #28 NEW cov: 11983 ft: 14151 corp: 16/595b lim: 105 exec/s: 0 rss: 70Mb L: 31/79 MS: 1 ShuffleBytes- 00:07:19.892 [2024-04-19 10:25:41.803366] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:29813 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.892 [2024-04-19 10:25:41.803392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.892 #29 NEW cov: 11983 ft: 14163 corp: 17/619b lim: 105 exec/s: 0 rss: 70Mb L: 24/79 MS: 1 CrossOver- 00:07:19.892 [2024-04-19 10:25:41.843474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.892 [2024-04-19 10:25:41.843500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.892 #30 NEW cov: 11983 ft: 14188 corp: 18/641b lim: 105 exec/s: 0 rss: 70Mb L: 22/79 MS: 1 EraseBytes- 00:07:19.892 [2024-04-19 10:25:41.883732] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.892 [2024-04-19 10:25:41.883758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.892 [2024-04-19 10:25:41.883835] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.892 [2024-04-19 10:25:41.883852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.892 #31 NEW cov: 11983 ft: 14204 corp: 19/696b lim: 105 exec/s: 31 rss: 70Mb L: 55/79 MS: 1 CrossOver- 00:07:19.892 [2024-04-19 10:25:41.923861] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.892 [2024-04-19 10:25:41.923887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.892 [2024-04-19 10:25:41.923952] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.892 [2024-04-19 10:25:41.923968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.892 #32 NEW cov: 11983 ft: 14270 corp: 20/751b lim: 105 exec/s: 32 rss: 70Mb L: 55/79 MS: 1 PersAutoDict- DE: "\001\000"- 00:07:19.892 [2024-04-19 10:25:41.963987] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.892 [2024-04-19 10:25:41.964013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.892 
[2024-04-19 10:25:41.964055] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11067877561816684953 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.892 [2024-04-19 10:25:41.964071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.892 #33 NEW cov: 11983 ft: 14314 corp: 21/797b lim: 105 exec/s: 33 rss: 70Mb L: 46/79 MS: 1 CopyPart- 00:07:20.152 [2024-04-19 10:25:42.003978] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:256 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.152 [2024-04-19 10:25:42.004005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.152 #34 NEW cov: 11983 ft: 14398 corp: 22/830b lim: 105 exec/s: 34 rss: 70Mb L: 33/79 MS: 1 ChangeBit- 00:07:20.152 [2024-04-19 10:25:42.044195] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:154 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.152 [2024-04-19 10:25:42.044221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.152 [2024-04-19 10:25:42.044258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:10923762373740829081 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.152 [2024-04-19 10:25:42.044274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.152 #35 NEW cov: 11983 ft: 14414 corp: 23/876b lim: 105 exec/s: 35 rss: 71Mb L: 46/79 MS: 1 ChangeBinInt- 00:07:20.152 [2024-04-19 10:25:42.084153] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.152 [2024-04-19 10:25:42.084180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.153 #36 NEW cov: 11983 ft: 14425 corp: 24/907b lim: 105 exec/s: 36 rss: 71Mb L: 31/79 MS: 1 ChangeBit- 00:07:20.153 [2024-04-19 10:25:42.124333] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:11 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.153 [2024-04-19 10:25:42.124358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.153 #37 NEW cov: 11983 ft: 14475 corp: 25/932b lim: 105 exec/s: 37 rss: 71Mb L: 25/79 MS: 1 ChangeBinInt- 00:07:20.153 [2024-04-19 10:25:42.164424] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.153 [2024-04-19 10:25:42.164449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.153 #38 NEW cov: 11983 ft: 14511 corp: 26/964b lim: 105 exec/s: 38 rss: 71Mb L: 32/79 MS: 1 CrossOver- 00:07:20.153 [2024-04-19 10:25:42.204510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.153 [2024-04-19 10:25:42.204535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.153 #39 NEW cov: 11983 
ft: 14530 corp: 27/986b lim: 105 exec/s: 39 rss: 71Mb L: 22/79 MS: 1 ChangeBinInt- 00:07:20.153 [2024-04-19 10:25:42.244643] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:256 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.153 [2024-04-19 10:25:42.244669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.412 #40 NEW cov: 11983 ft: 14531 corp: 28/1019b lim: 105 exec/s: 40 rss: 71Mb L: 33/79 MS: 1 ShuffleBytes- 00:07:20.412 [2024-04-19 10:25:42.285106] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:256 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.412 [2024-04-19 10:25:42.285131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.412 [2024-04-19 10:25:42.285186] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.412 [2024-04-19 10:25:42.285200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.412 [2024-04-19 10:25:42.285253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.412 [2024-04-19 10:25:42.285269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.412 [2024-04-19 10:25:42.285320] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.412 [2024-04-19 10:25:42.285335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.412 #41 NEW cov: 11983 ft: 15013 corp: 29/1110b lim: 105 exec/s: 41 rss: 71Mb L: 91/91 MS: 1 InsertRepeatedBytes- 00:07:20.412 [2024-04-19 10:25:42.324976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.412 [2024-04-19 10:25:42.325001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.412 [2024-04-19 10:25:42.325056] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11068045787095734681 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.412 [2024-04-19 10:25:42.325072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.412 #42 NEW cov: 11983 ft: 15019 corp: 30/1157b lim: 105 exec/s: 42 rss: 71Mb L: 47/91 MS: 1 InsertByte- 00:07:20.412 [2024-04-19 10:25:42.365099] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:8391460049216894068 len:29813 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.412 [2024-04-19 10:25:42.365124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.412 [2024-04-19 10:25:42.365178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:20.412 [2024-04-19 10:25:42.365199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.412 #48 NEW cov: 11983 ft: 15048 corp: 31/1215b lim: 105 exec/s: 48 rss: 71Mb L: 58/91 MS: 1 ChangeBit- 00:07:20.412 [2024-04-19 10:25:42.405100] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.412 [2024-04-19 10:25:42.405126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.412 #49 NEW cov: 11983 ft: 15058 corp: 32/1248b lim: 105 exec/s: 49 rss: 71Mb L: 33/91 MS: 1 PersAutoDict- DE: "\001\000"- 00:07:20.412 [2024-04-19 10:25:42.445216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:65280 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.412 [2024-04-19 10:25:42.445242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.412 #50 NEW cov: 11983 ft: 15060 corp: 33/1282b lim: 105 exec/s: 50 rss: 72Mb L: 34/91 MS: 1 InsertByte- 00:07:20.412 [2024-04-19 10:25:42.485529] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.412 [2024-04-19 10:25:42.485554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.412 [2024-04-19 10:25:42.485606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.412 [2024-04-19 10:25:42.485621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.412 [2024-04-19 10:25:42.485673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.412 [2024-04-19 10:25:42.485689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.412 #51 NEW cov: 11983 ft: 15071 corp: 34/1351b lim: 105 exec/s: 51 rss: 72Mb L: 69/91 MS: 1 InsertRepeatedBytes- 00:07:20.671 [2024-04-19 10:25:42.525447] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:256 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.671 [2024-04-19 10:25:42.525474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.671 #52 NEW cov: 11983 ft: 15089 corp: 35/1384b lim: 105 exec/s: 52 rss: 72Mb L: 33/91 MS: 1 ShuffleBytes- 00:07:20.671 [2024-04-19 10:25:42.565630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:154 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.671 [2024-04-19 10:25:42.565655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.671 [2024-04-19 10:25:42.565711] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:10923762373740829081 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.671 [2024-04-19 10:25:42.565728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.671 #53 NEW cov: 11983 ft: 15094 corp: 36/1430b lim: 105 exec/s: 53 rss: 72Mb L: 46/91 MS: 1 ChangeByte- 00:07:20.671 [2024-04-19 10:25:42.606003] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:154 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.671 [2024-04-19 10:25:42.606029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.671 [2024-04-19 10:25:42.606096] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744071991590911 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.671 [2024-04-19 10:25:42.606112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.671 [2024-04-19 10:25:42.606167] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.671 [2024-04-19 10:25:42.606183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.671 [2024-04-19 10:25:42.606235] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:11068046445943717887 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.671 [2024-04-19 10:25:42.606251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.671 #54 NEW cov: 11983 ft: 15105 corp: 37/1520b lim: 105 exec/s: 54 rss: 72Mb L: 90/91 MS: 1 InsertRepeatedBytes- 00:07:20.671 [2024-04-19 10:25:42.645878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.672 [2024-04-19 10:25:42.645903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.672 [2024-04-19 10:25:42.645941] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2814749767106560 len:2561 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.672 [2024-04-19 10:25:42.645957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.672 #55 NEW cov: 11983 ft: 15162 corp: 38/1573b lim: 105 exec/s: 55 rss: 72Mb L: 53/91 MS: 1 CopyPart- 00:07:20.672 [2024-04-19 10:25:42.686014] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.672 [2024-04-19 10:25:42.686039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.672 [2024-04-19 10:25:42.686077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.672 [2024-04-19 10:25:42.686093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.672 #56 NEW cov: 11983 ft: 15171 corp: 39/1628b lim: 105 exec/s: 56 rss: 72Mb L: 55/91 MS: 1 ChangeBinInt- 00:07:20.672 [2024-04-19 10:25:42.725994] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ 
sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.672 [2024-04-19 10:25:42.726020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.672 #60 NEW cov: 11983 ft: 15214 corp: 40/1669b lim: 105 exec/s: 60 rss: 72Mb L: 41/91 MS: 4 CopyPart-InsertByte-EraseBytes-InsertRepeatedBytes- 00:07:20.672 [2024-04-19 10:25:42.766354] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:154 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.672 [2024-04-19 10:25:42.766379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.672 [2024-04-19 10:25:42.766421] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:8391460345569637492 len:29813 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.672 [2024-04-19 10:25:42.766437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.672 [2024-04-19 10:25:42.766489] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:11068045787095734681 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.672 [2024-04-19 10:25:42.766504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.931 #61 NEW cov: 11983 ft: 15232 corp: 41/1748b lim: 105 exec/s: 61 rss: 72Mb L: 79/91 MS: 1 ChangeBit- 00:07:20.931 [2024-04-19 10:25:42.816398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:65280 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.931 [2024-04-19 10:25:42.816428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.931 [2024-04-19 10:25:42.816495] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.931 [2024-04-19 10:25:42.816512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.931 #62 NEW cov: 11983 ft: 15240 corp: 42/1791b lim: 105 exec/s: 62 rss: 72Mb L: 43/91 MS: 1 CrossOver- 00:07:20.931 [2024-04-19 10:25:42.856573] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.931 [2024-04-19 10:25:42.856598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.931 [2024-04-19 10:25:42.856643] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.931 [2024-04-19 10:25:42.856658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.931 [2024-04-19 10:25:42.856726] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.931 [2024-04-19 10:25:42.856742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.931 #63 NEW cov: 11983 ft: 
15242 corp: 43/1869b lim: 105 exec/s: 63 rss: 72Mb L: 78/91 MS: 1 InsertRepeatedBytes-
00:07:20.931 [2024-04-19 10:25:42.896477] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:256 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.931 [2024-04-19 10:25:42.896504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:20.931 #64 pulse cov: 11983 ft: 15254 corp: 43/1869b lim: 105 exec/s: 32 rss: 72Mb
00:07:20.931 #64 NEW cov: 11983 ft: 15254 corp: 44/1902b lim: 105 exec/s: 32 rss: 72Mb L: 33/91 MS: 1 ChangeByte-
00:07:20.931 #64 DONE cov: 11983 ft: 15254 corp: 44/1902b lim: 105 exec/s: 32 rss: 72Mb
00:07:20.931 ###### Recommended dictionary. ######
00:07:20.931 "\001\000" # Uses: 4
00:07:20.931 ###### End of recommended dictionary. ######
00:07:20.931 Done 64 runs in 2 second(s)
00:07:20.931 10:25:43 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_16.conf /var/tmp/suppress_nvmf_fuzz
00:07:20.931 10:25:43 -- ../common.sh@72 -- # (( i++ ))
00:07:20.931 10:25:43 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:20.931 10:25:43 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1
00:07:21.191 10:25:43 -- nvmf/run.sh@23 -- # local fuzzer_type=17
00:07:21.191 10:25:43 -- nvmf/run.sh@24 -- # local timen=1
00:07:21.191 10:25:43 -- nvmf/run.sh@25 -- # local core=0x1
00:07:21.191 10:25:43 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:07:21.191 10:25:43 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf
00:07:21.191 10:25:43 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:07:21.191 10:25:43 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:07:21.191 10:25:43 -- nvmf/run.sh@34 -- # printf %02d 17
00:07:21.191 10:25:43 -- nvmf/run.sh@34 -- # port=4417
00:07:21.191 10:25:43 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:07:21.191 10:25:43 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417'
00:07:21.191 10:25:43 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:21.191 10:25:43 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:07:21.191 10:25:43 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:07:21.191 10:25:43 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17
00:07:21.191 [2024-04-19 10:25:43.084405] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization...
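This second run closes the same way: the #64 DONE record carries the final coverage (cov), feature (ft), and corpus (corp: entries/bytes) counts, and the recommended dictionary lists the byte sequences libFuzzer found productive (here "\001\000", used 4 times). As a hypothetical helper, not part of the SPDK scripts, the one-liner below pulls those final numbers out of a saved console log; it assumes each record sits on its own line behind its elapsed-time stamp, as in the summary above, and build.log stands in for wherever the log was saved.

    # Print final coverage stats for each completed run, matching lines like:
    #   00:07:20.931 #64 DONE cov: 11983 ft: 15254 corp: 44/1902b lim: 105 exec/s: 32 rss: 72Mb
    awk '$3 == "DONE" && $4 == "cov:" { printf "cov=%s ft=%s corp=%s\n", $5, $7, $9 }' build.log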
00:07:21.191 [2024-04-19 10:25:43.084474] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid206122 ] 00:07:21.191 EAL: No free 2048 kB hugepages reported on node 1 00:07:21.451 [2024-04-19 10:25:43.341837] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.451 [2024-04-19 10:25:43.418932] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.451 [2024-04-19 10:25:43.477904] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:21.451 [2024-04-19 10:25:43.494049] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:07:21.451 INFO: Running with entropic power schedule (0xFF, 100). 00:07:21.451 INFO: Seed: 3324890109 00:07:21.451 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:07:21.451 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:07:21.451 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:21.451 INFO: A corpus is not provided, starting from an empty corpus 00:07:21.451 #2 INITED exec/s: 0 rss: 61Mb 00:07:21.451 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:21.451 This may also happen if the target rejected all inputs we tried so far 00:07:21.451 [2024-04-19 10:25:43.539009] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:10923366096610927767 len:38808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.451 [2024-04-19 10:25:43.539044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.451 [2024-04-19 10:25:43.539079] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:10923366098549577623 len:38808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.451 [2024-04-19 10:25:43.539097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:21.451 [2024-04-19 10:25:43.539128] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:10923366098549577623 len:38808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.451 [2024-04-19 10:25:43.539144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:21.451 [2024-04-19 10:25:43.539174] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:10923366098549577623 len:38808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.451 [2024-04-19 10:25:43.539191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:21.969 NEW_FUNC[1/670]: 0x49bb40 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:07:21.969 NEW_FUNC[2/670]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:21.969 #26 NEW cov: 11757 ft: 11755 corp: 2/100b lim: 120 exec/s: 0 rss: 67Mb L: 99/99 MS: 4 CopyPart-InsertByte-CopyPart-InsertRepeatedBytes- 00:07:21.969 [2024-04-19 10:25:43.879656] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.969 [2024-04-19 10:25:43.879701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.969 [2024-04-19 10:25:43.879754] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.969 [2024-04-19 10:25:43.879772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:21.969 NEW_FUNC[1/2]: 0xfa5280 in _sock_flush /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/sock/posix/posix.c:1323 00:07:21.969 NEW_FUNC[2/2]: 0xfa9730 in spdk_sock_prep_reqs /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk_internal/sock.h:297 00:07:21.969 #28 NEW cov: 11890 ft: 12625 corp: 3/171b lim: 120 exec/s: 0 rss: 68Mb L: 71/99 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:07:21.969 [2024-04-19 10:25:43.939604] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.969 [2024-04-19 10:25:43.939638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.969 #30 NEW cov: 11896 ft: 13736 corp: 4/196b lim: 120 exec/s: 0 rss: 68Mb L: 25/99 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:21.969 [2024-04-19 10:25:43.999773] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.969 [2024-04-19 10:25:43.999804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.969 #31 NEW cov: 11981 ft: 13979 corp: 5/220b lim: 120 exec/s: 0 rss: 68Mb L: 24/99 MS: 1 InsertRepeatedBytes- 00:07:21.969 [2024-04-19 10:25:44.049914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.969 [2024-04-19 10:25:44.049944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.229 #32 NEW cov: 11981 ft: 14246 corp: 6/244b lim: 120 exec/s: 0 rss: 68Mb L: 24/99 MS: 1 ChangeBinInt- 00:07:22.229 [2024-04-19 10:25:44.120119] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.229 [2024-04-19 10:25:44.120148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.229 #37 NEW cov: 11981 ft: 14278 corp: 7/285b lim: 120 exec/s: 0 rss: 68Mb L: 41/99 MS: 5 CopyPart-CrossOver-ChangeBit-ChangeBit-CrossOver- 00:07:22.229 [2024-04-19 10:25:44.170238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709498879 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.229 [2024-04-19 10:25:44.170268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.229 #43 NEW cov: 11981 ft: 14366 corp: 8/309b lim: 120 exec/s: 0 rss: 68Mb L: 24/99 MS: 1 ChangeByte- 00:07:22.229 [2024-04-19 
10:25:44.230506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.229 [2024-04-19 10:25:44.230537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.229 [2024-04-19 10:25:44.230587] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.229 [2024-04-19 10:25:44.230605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.229 [2024-04-19 10:25:44.230636] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.229 [2024-04-19 10:25:44.230653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.229 #44 NEW cov: 11981 ft: 14695 corp: 9/381b lim: 120 exec/s: 0 rss: 68Mb L: 72/99 MS: 1 InsertByte- 00:07:22.229 [2024-04-19 10:25:44.300678] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.229 [2024-04-19 10:25:44.300708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.229 [2024-04-19 10:25:44.300761] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.229 [2024-04-19 10:25:44.300779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.229 [2024-04-19 10:25:44.300816] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.229 [2024-04-19 10:25:44.300833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.229 #45 NEW cov: 11981 ft: 14728 corp: 10/453b lim: 120 exec/s: 0 rss: 68Mb L: 72/99 MS: 1 InsertByte- 00:07:22.489 [2024-04-19 10:25:44.350789] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:432345564227567616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.489 [2024-04-19 10:25:44.350825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.489 [2024-04-19 10:25:44.350875] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.489 [2024-04-19 10:25:44.350893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.489 [2024-04-19 10:25:44.350924] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.489 [2024-04-19 10:25:44.350940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.489 #46 NEW cov: 11981 ft: 14878 corp: 11/527b lim: 120 exec/s: 0 rss: 68Mb L: 74/99 MS: 1 CMP- DE: "\000\006"- 00:07:22.489 [2024-04-19 10:25:44.420858] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.489 [2024-04-19 10:25:44.420888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.489 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:22.489 #52 NEW cov: 12004 ft: 14937 corp: 12/552b lim: 120 exec/s: 0 rss: 68Mb L: 25/99 MS: 1 InsertByte- 00:07:22.489 [2024-04-19 10:25:44.481021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.489 [2024-04-19 10:25:44.481051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.489 #53 NEW cov: 12004 ft: 14975 corp: 13/593b lim: 120 exec/s: 53 rss: 68Mb L: 41/99 MS: 1 ChangeBit- 00:07:22.489 [2024-04-19 10:25:44.551384] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:10923366096610927767 len:38808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.489 [2024-04-19 10:25:44.551414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.489 [2024-04-19 10:25:44.551462] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7523377975951595369 len:27800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.489 [2024-04-19 10:25:44.551480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.489 [2024-04-19 10:25:44.551510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:10923366098549577623 len:38808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.489 [2024-04-19 10:25:44.551526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.489 [2024-04-19 10:25:44.551555] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:10923366098549577623 len:38808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.489 [2024-04-19 10:25:44.551575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.748 #54 NEW cov: 12004 ft: 15009 corp: 14/692b lim: 120 exec/s: 54 rss: 68Mb L: 99/99 MS: 1 ChangeBinInt- 00:07:22.748 [2024-04-19 10:25:44.621569] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:10923366096610927767 len:38808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.748 [2024-04-19 10:25:44.621598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.748 [2024-04-19 10:25:44.621646] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:10923366098549577623 len:38808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.748 [2024-04-19 10:25:44.621664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.748 [2024-04-19 10:25:44.621694] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:10923366098549577623 len:38808 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.748 [2024-04-19 10:25:44.621710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.748 [2024-04-19 10:25:44.621739] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:10923366098549577623 len:38808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.748 [2024-04-19 10:25:44.621755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.748 #55 NEW cov: 12004 ft: 15108 corp: 15/791b lim: 120 exec/s: 55 rss: 68Mb L: 99/99 MS: 1 ChangeByte- 00:07:22.748 [2024-04-19 10:25:44.671473] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1688849860198656 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.748 [2024-04-19 10:25:44.671502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.748 #56 NEW cov: 12004 ft: 15119 corp: 16/819b lim: 120 exec/s: 56 rss: 69Mb L: 28/99 MS: 1 CMP- DE: "\001\000\000\005"- 00:07:22.748 [2024-04-19 10:25:44.741677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:10923366096610927767 len:38808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.748 [2024-04-19 10:25:44.741707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.748 #57 NEW cov: 12004 ft: 15140 corp: 17/858b lim: 120 exec/s: 57 rss: 69Mb L: 39/99 MS: 1 CrossOver- 00:07:22.748 [2024-04-19 10:25:44.811888] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.748 [2024-04-19 10:25:44.811918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.007 #63 NEW cov: 12004 ft: 15154 corp: 18/883b lim: 120 exec/s: 63 rss: 69Mb L: 25/99 MS: 1 ChangeBit- 00:07:23.007 [2024-04-19 10:25:44.882043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.007 [2024-04-19 10:25:44.882072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.007 #64 NEW cov: 12004 ft: 15167 corp: 19/908b lim: 120 exec/s: 64 rss: 69Mb L: 25/99 MS: 1 ChangeBit- 00:07:23.007 [2024-04-19 10:25:44.952432] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.007 [2024-04-19 10:25:44.952463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.007 [2024-04-19 10:25:44.952496] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.007 [2024-04-19 10:25:44.952513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.007 [2024-04-19 10:25:44.952548] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:210006720905216 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.007 [2024-04-19 10:25:44.952565] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.007 #65 NEW cov: 12004 ft: 15203 corp: 20/980b lim: 120 exec/s: 65 rss: 69Mb L: 72/99 MS: 1 ChangeByte- 00:07:23.007 [2024-04-19 10:25:45.022382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.007 [2024-04-19 10:25:45.022411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.007 #66 NEW cov: 12004 ft: 15281 corp: 21/1005b lim: 120 exec/s: 66 rss: 69Mb L: 25/99 MS: 1 ChangeByte- 00:07:23.007 [2024-04-19 10:25:45.092706] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.008 [2024-04-19 10:25:45.092736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.008 [2024-04-19 10:25:45.092784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.008 [2024-04-19 10:25:45.092802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.008 [2024-04-19 10:25:45.092839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.008 [2024-04-19 10:25:45.092855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.273 #67 NEW cov: 12004 ft: 15322 corp: 22/1077b lim: 120 exec/s: 67 rss: 69Mb L: 72/99 MS: 1 InsertByte- 00:07:23.273 [2024-04-19 10:25:45.142852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:432345564227567616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.273 [2024-04-19 10:25:45.142882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.273 [2024-04-19 10:25:45.142915] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.273 [2024-04-19 10:25:45.142932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.273 [2024-04-19 10:25:45.142963] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.273 [2024-04-19 10:25:45.142979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.273 #68 NEW cov: 12004 ft: 15344 corp: 23/1151b lim: 120 exec/s: 68 rss: 69Mb L: 74/99 MS: 1 ShuffleBytes- 00:07:23.273 [2024-04-19 10:25:45.213060] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:10923366096610927767 len:38808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.273 [2024-04-19 10:25:45.213090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.273 [2024-04-19 10:25:45.213124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 
lba:10923366098549577623 len:38808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.273 [2024-04-19 10:25:45.213141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.273 [2024-04-19 10:25:45.213187] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:10923366098549577623 len:38808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.273 [2024-04-19 10:25:45.213204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.273 #69 NEW cov: 12004 ft: 15356 corp: 24/1239b lim: 120 exec/s: 69 rss: 69Mb L: 88/99 MS: 1 EraseBytes- 00:07:23.273 [2024-04-19 10:25:45.283299] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.273 [2024-04-19 10:25:45.283329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.273 [2024-04-19 10:25:45.283362] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.273 [2024-04-19 10:25:45.283379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.273 [2024-04-19 10:25:45.283409] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.273 [2024-04-19 10:25:45.283426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.273 [2024-04-19 10:25:45.283455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.273 [2024-04-19 10:25:45.283471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.273 #70 NEW cov: 12004 ft: 15380 corp: 25/1349b lim: 120 exec/s: 70 rss: 69Mb L: 110/110 MS: 1 InsertRepeatedBytes- 00:07:23.273 [2024-04-19 10:25:45.353479] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:10923366096610927767 len:38808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.273 [2024-04-19 10:25:45.353509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.273 [2024-04-19 10:25:45.353542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:10923366098549577623 len:38808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.273 [2024-04-19 10:25:45.353559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.273 [2024-04-19 10:25:45.353590] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:10923366098549577623 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.273 [2024-04-19 10:25:45.353606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.274 [2024-04-19 10:25:45.353636] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:10923366098549577623 len:38808 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.274 [2024-04-19 10:25:45.353652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.534 #71 NEW cov: 12004 ft: 15409 corp: 26/1458b lim: 120 exec/s: 71 rss: 69Mb L: 109/110 MS: 1 InsertRepeatedBytes- 00:07:23.534 [2024-04-19 10:25:45.403383] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709498879 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.534 [2024-04-19 10:25:45.403412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.534 #72 NEW cov: 12004 ft: 15413 corp: 27/1483b lim: 120 exec/s: 72 rss: 69Mb L: 25/110 MS: 1 InsertByte- 00:07:23.534 [2024-04-19 10:25:45.473683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.534 [2024-04-19 10:25:45.473712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.534 [2024-04-19 10:25:45.473759] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.534 [2024-04-19 10:25:45.473776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.534 [2024-04-19 10:25:45.473817] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.534 [2024-04-19 10:25:45.473849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.534 #73 NEW cov: 12004 ft: 15424 corp: 28/1555b lim: 120 exec/s: 73 rss: 70Mb L: 72/110 MS: 1 ShuffleBytes- 00:07:23.534 [2024-04-19 10:25:45.523685] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.534 [2024-04-19 10:25:45.523715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.534 #74 NEW cov: 12004 ft: 15437 corp: 29/1580b lim: 120 exec/s: 37 rss: 70Mb L: 25/110 MS: 1 InsertByte- 00:07:23.534 #74 DONE cov: 12004 ft: 15437 corp: 29/1580b lim: 120 exec/s: 37 rss: 70Mb 00:07:23.534 ###### Recommended dictionary. ###### 00:07:23.534 "\000\006" # Uses: 1 00:07:23.534 "\001\000\000\005" # Uses: 1 00:07:23.534 ###### End of recommended dictionary. 
###### 00:07:23.534 Done 74 runs in 2 second(s) 00:07:23.793 10:25:45 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_17.conf /var/tmp/suppress_nvmf_fuzz 00:07:23.793 10:25:45 -- ../common.sh@72 -- # (( i++ )) 00:07:23.793 10:25:45 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:23.793 10:25:45 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:07:23.793 10:25:45 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:07:23.793 10:25:45 -- nvmf/run.sh@24 -- # local timen=1 00:07:23.793 10:25:45 -- nvmf/run.sh@25 -- # local core=0x1 00:07:23.793 10:25:45 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:23.793 10:25:45 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:07:23.793 10:25:45 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:23.793 10:25:45 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:23.793 10:25:45 -- nvmf/run.sh@34 -- # printf %02d 18 00:07:23.793 10:25:45 -- nvmf/run.sh@34 -- # port=4418 00:07:23.793 10:25:45 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:23.793 10:25:45 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:07:23.793 10:25:45 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:23.793 10:25:45 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:23.793 10:25:45 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:23.794 10:25:45 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 00:07:23.794 [2024-04-19 10:25:45.734584] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:07:23.794 [2024-04-19 10:25:45.734655] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid206483 ] 00:07:23.794 EAL: No free 2048 kB hugepages reported on node 1 00:07:24.053 [2024-04-19 10:25:45.987001] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:24.053 [2024-04-19 10:25:46.070361] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.053 [2024-04-19 10:25:46.129440] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:24.053 [2024-04-19 10:25:46.145577] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:07:24.053 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:24.053 INFO: Seed: 1679931975 00:07:24.313 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:07:24.313 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:07:24.313 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:24.313 INFO: A corpus is not provided, starting from an empty corpus 00:07:24.313 #2 INITED exec/s: 0 rss: 63Mb 00:07:24.313 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:24.313 This may also happen if the target rejected all inputs we tried so far 00:07:24.313 [2024-04-19 10:25:46.201124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:24.313 [2024-04-19 10:25:46.201152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.313 [2024-04-19 10:25:46.201191] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:24.313 [2024-04-19 10:25:46.201205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.313 [2024-04-19 10:25:46.201254] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:24.313 [2024-04-19 10:25:46.201268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.313 [2024-04-19 10:25:46.201319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:24.313 [2024-04-19 10:25:46.201333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:24.573 NEW_FUNC[1/669]: 0x49f430 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:07:24.573 NEW_FUNC[2/669]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:24.573 #6 NEW cov: 11694 ft: 11695 corp: 2/87b lim: 100 exec/s: 0 rss: 69Mb L: 86/86 MS: 4 ShuffleBytes-CopyPart-ChangeBinInt-InsertRepeatedBytes- 00:07:24.573 [2024-04-19 10:25:46.512010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:24.573 [2024-04-19 10:25:46.512068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.573 [2024-04-19 10:25:46.512145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:24.573 [2024-04-19 10:25:46.512171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.573 [2024-04-19 10:25:46.512245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:24.573 [2024-04-19 10:25:46.512271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.573 [2024-04-19 10:25:46.512343] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:24.573 [2024-04-19 10:25:46.512368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:24.573 NEW_FUNC[1/1]: 0x17424e0 in nvme_get_transport /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_transport.c:56 00:07:24.573 #7 NEW cov: 11833 ft: 12404 corp: 3/173b lim: 100 exec/s: 0 rss: 69Mb L: 86/86 MS: 1 ChangeBit- 00:07:24.573 [2024-04-19 10:25:46.561797] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:24.573 [2024-04-19 10:25:46.561828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.573 [2024-04-19 10:25:46.561874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:24.573 [2024-04-19 10:25:46.561889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.573 [2024-04-19 10:25:46.561941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:24.573 [2024-04-19 10:25:46.561957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.573 #8 NEW cov: 11839 ft: 12878 corp: 4/243b lim: 100 exec/s: 0 rss: 69Mb L: 70/86 MS: 1 EraseBytes- 00:07:24.573 [2024-04-19 10:25:46.601983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:24.573 [2024-04-19 10:25:46.602008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.573 [2024-04-19 10:25:46.602075] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:24.573 [2024-04-19 10:25:46.602090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.573 [2024-04-19 10:25:46.602139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:24.573 [2024-04-19 10:25:46.602153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.573 [2024-04-19 10:25:46.602213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:24.573 [2024-04-19 10:25:46.602226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:24.573 #14 NEW cov: 11924 ft: 13172 corp: 5/337b lim: 100 exec/s: 0 rss: 69Mb L: 94/94 MS: 1 CMP- DE: "\000\031\371\034\232\331\004\\"- 00:07:24.573 [2024-04-19 10:25:46.642030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:24.573 [2024-04-19 10:25:46.642055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.573 [2024-04-19 10:25:46.642102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:24.573 [2024-04-19 10:25:46.642116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.573 [2024-04-19 10:25:46.642165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 
00:07:24.573 [2024-04-19 10:25:46.642179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.573 #15 NEW cov: 11924 ft: 13299 corp: 6/408b lim: 100 exec/s: 0 rss: 70Mb L: 71/94 MS: 1 InsertByte- 00:07:24.573 [2024-04-19 10:25:46.681884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:24.573 [2024-04-19 10:25:46.681909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.833 #16 NEW cov: 11924 ft: 13702 corp: 7/444b lim: 100 exec/s: 0 rss: 70Mb L: 36/94 MS: 1 CrossOver- 00:07:24.833 [2024-04-19 10:25:46.722319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:24.833 [2024-04-19 10:25:46.722344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.833 [2024-04-19 10:25:46.722393] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:24.833 [2024-04-19 10:25:46.722407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.833 [2024-04-19 10:25:46.722455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:24.833 [2024-04-19 10:25:46.722468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.833 [2024-04-19 10:25:46.722520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:24.833 [2024-04-19 10:25:46.722534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:24.833 #17 NEW cov: 11924 ft: 13758 corp: 8/527b lim: 100 exec/s: 0 rss: 70Mb L: 83/94 MS: 1 InsertRepeatedBytes- 00:07:24.833 [2024-04-19 10:25:46.772460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:24.833 [2024-04-19 10:25:46.772484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.833 [2024-04-19 10:25:46.772533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:24.833 [2024-04-19 10:25:46.772547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.833 [2024-04-19 10:25:46.772611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:24.833 [2024-04-19 10:25:46.772626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.833 [2024-04-19 10:25:46.772677] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:24.833 [2024-04-19 10:25:46.772691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:24.833 #18 NEW cov: 11924 ft: 13877 corp: 9/626b lim: 100 exec/s: 0 rss: 70Mb L: 99/99 MS: 1 CopyPart- 00:07:24.833 [2024-04-19 10:25:46.812552] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:24.833 [2024-04-19 10:25:46.812577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.833 [2024-04-19 10:25:46.812629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:24.833 [2024-04-19 10:25:46.812642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.833 [2024-04-19 10:25:46.812691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:24.833 [2024-04-19 10:25:46.812704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.833 [2024-04-19 10:25:46.812754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:24.833 [2024-04-19 10:25:46.812766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:24.833 #19 NEW cov: 11924 ft: 13887 corp: 10/706b lim: 100 exec/s: 0 rss: 70Mb L: 80/99 MS: 1 InsertRepeatedBytes- 00:07:24.833 [2024-04-19 10:25:46.852702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:24.833 [2024-04-19 10:25:46.852726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.833 [2024-04-19 10:25:46.852795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:24.833 [2024-04-19 10:25:46.852813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.833 [2024-04-19 10:25:46.852863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:24.833 [2024-04-19 10:25:46.852876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.833 [2024-04-19 10:25:46.852939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:24.833 [2024-04-19 10:25:46.852953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:24.833 #20 NEW cov: 11924 ft: 14029 corp: 11/800b lim: 100 exec/s: 0 rss: 70Mb L: 94/99 MS: 1 PersAutoDict- DE: "\000\031\371\034\232\331\004\\"- 00:07:24.833 [2024-04-19 10:25:46.892489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:24.833 [2024-04-19 10:25:46.892515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.833 #21 NEW cov: 11924 ft: 14118 corp: 12/837b lim: 100 exec/s: 0 rss: 70Mb L: 37/99 MS: 1 InsertByte- 00:07:24.833 [2024-04-19 10:25:46.932947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:24.833 [2024-04-19 10:25:46.932972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.833 [2024-04-19 10:25:46.933038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 
cid:1 nsid:0 00:07:24.833 [2024-04-19 10:25:46.933051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.833 [2024-04-19 10:25:46.933102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:24.833 [2024-04-19 10:25:46.933116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.833 [2024-04-19 10:25:46.933165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:24.833 [2024-04-19 10:25:46.933180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.093 #22 NEW cov: 11924 ft: 14134 corp: 13/936b lim: 100 exec/s: 0 rss: 70Mb L: 99/99 MS: 1 ShuffleBytes- 00:07:25.093 [2024-04-19 10:25:46.972969] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.093 [2024-04-19 10:25:46.972993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.093 [2024-04-19 10:25:46.973041] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.093 [2024-04-19 10:25:46.973056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.093 [2024-04-19 10:25:46.973109] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.093 [2024-04-19 10:25:46.973122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.093 #23 NEW cov: 11924 ft: 14142 corp: 14/1003b lim: 100 exec/s: 0 rss: 70Mb L: 67/99 MS: 1 CrossOver- 00:07:25.093 [2024-04-19 10:25:47.013176] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.093 [2024-04-19 10:25:47.013202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.093 [2024-04-19 10:25:47.013251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.093 [2024-04-19 10:25:47.013265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.093 [2024-04-19 10:25:47.013316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.093 [2024-04-19 10:25:47.013331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.093 [2024-04-19 10:25:47.013379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.093 [2024-04-19 10:25:47.013392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.093 #24 NEW cov: 11924 ft: 14148 corp: 15/1097b lim: 100 exec/s: 0 rss: 70Mb L: 94/99 MS: 1 PersAutoDict- DE: "\000\031\371\034\232\331\004\\"- 00:07:25.093 [2024-04-19 10:25:47.053186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.093 
[2024-04-19 10:25:47.053212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.093 [2024-04-19 10:25:47.053263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.093 [2024-04-19 10:25:47.053281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.093 [2024-04-19 10:25:47.053331] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.093 [2024-04-19 10:25:47.053346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.094 #25 NEW cov: 11924 ft: 14167 corp: 16/1167b lim: 100 exec/s: 0 rss: 70Mb L: 70/99 MS: 1 CrossOver- 00:07:25.094 [2024-04-19 10:25:47.093378] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.094 [2024-04-19 10:25:47.093404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.094 [2024-04-19 10:25:47.093454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.094 [2024-04-19 10:25:47.093468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.094 [2024-04-19 10:25:47.093519] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.094 [2024-04-19 10:25:47.093534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.094 [2024-04-19 10:25:47.093585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.094 [2024-04-19 10:25:47.093599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.094 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:25.094 #26 NEW cov: 11947 ft: 14209 corp: 17/1254b lim: 100 exec/s: 0 rss: 70Mb L: 87/99 MS: 1 InsertByte- 00:07:25.094 [2024-04-19 10:25:47.133523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.094 [2024-04-19 10:25:47.133551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.094 [2024-04-19 10:25:47.133604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.094 [2024-04-19 10:25:47.133619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.094 [2024-04-19 10:25:47.133673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.094 [2024-04-19 10:25:47.133686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.094 [2024-04-19 10:25:47.133739] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.094 [2024-04-19 10:25:47.133754] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.094 #27 NEW cov: 11947 ft: 14258 corp: 18/1334b lim: 100 exec/s: 0 rss: 70Mb L: 80/99 MS: 1 InsertRepeatedBytes- 00:07:25.094 [2024-04-19 10:25:47.183635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.094 [2024-04-19 10:25:47.183661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.094 [2024-04-19 10:25:47.183707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.094 [2024-04-19 10:25:47.183721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.094 [2024-04-19 10:25:47.183770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.094 [2024-04-19 10:25:47.183782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.094 [2024-04-19 10:25:47.183836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.094 [2024-04-19 10:25:47.183853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.094 #28 NEW cov: 11947 ft: 14338 corp: 19/1428b lim: 100 exec/s: 28 rss: 70Mb L: 94/99 MS: 1 ChangeByte- 00:07:25.354 [2024-04-19 10:25:47.223758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.354 [2024-04-19 10:25:47.223784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.354 [2024-04-19 10:25:47.223838] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.354 [2024-04-19 10:25:47.223852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.354 [2024-04-19 10:25:47.223902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.354 [2024-04-19 10:25:47.223914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.354 [2024-04-19 10:25:47.223965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.354 [2024-04-19 10:25:47.223978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.354 #29 NEW cov: 11947 ft: 14353 corp: 20/1522b lim: 100 exec/s: 29 rss: 70Mb L: 94/99 MS: 1 ChangeBinInt- 00:07:25.354 [2024-04-19 10:25:47.263841] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.354 [2024-04-19 10:25:47.263865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.354 [2024-04-19 10:25:47.263920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.354 [2024-04-19 10:25:47.263934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.354 [2024-04-19 10:25:47.263985] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.354 [2024-04-19 10:25:47.263998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.354 [2024-04-19 10:25:47.264049] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.354 [2024-04-19 10:25:47.264063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.354 #30 NEW cov: 11947 ft: 14357 corp: 21/1608b lim: 100 exec/s: 30 rss: 70Mb L: 86/99 MS: 1 ChangeBit- 00:07:25.354 [2024-04-19 10:25:47.303969] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.354 [2024-04-19 10:25:47.303993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.354 [2024-04-19 10:25:47.304070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.354 [2024-04-19 10:25:47.304100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.354 [2024-04-19 10:25:47.304150] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.354 [2024-04-19 10:25:47.304163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.354 [2024-04-19 10:25:47.304213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.354 [2024-04-19 10:25:47.304227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.354 #31 NEW cov: 11947 ft: 14367 corp: 22/1705b lim: 100 exec/s: 31 rss: 71Mb L: 97/99 MS: 1 InsertRepeatedBytes- 00:07:25.354 [2024-04-19 10:25:47.344070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.354 [2024-04-19 10:25:47.344095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.354 [2024-04-19 10:25:47.344148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.354 [2024-04-19 10:25:47.344162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.354 [2024-04-19 10:25:47.344212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.354 [2024-04-19 10:25:47.344226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.354 [2024-04-19 10:25:47.344275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.354 [2024-04-19 10:25:47.344289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.354 #32 NEW cov: 11947 ft: 14383 corp: 23/1804b lim: 100 exec/s: 32 rss: 71Mb L: 99/99 MS: 1 CMP- DE: 
"\377\007"- 00:07:25.354 [2024-04-19 10:25:47.384170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.354 [2024-04-19 10:25:47.384194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.354 [2024-04-19 10:25:47.384249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.354 [2024-04-19 10:25:47.384263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.354 [2024-04-19 10:25:47.384313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.354 [2024-04-19 10:25:47.384327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.354 [2024-04-19 10:25:47.384379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.354 [2024-04-19 10:25:47.384394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.354 #33 NEW cov: 11947 ft: 14395 corp: 24/1903b lim: 100 exec/s: 33 rss: 71Mb L: 99/99 MS: 1 ChangeBit- 00:07:25.354 [2024-04-19 10:25:47.424150] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.354 [2024-04-19 10:25:47.424174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.354 [2024-04-19 10:25:47.424224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.354 [2024-04-19 10:25:47.424238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.354 [2024-04-19 10:25:47.424289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.354 [2024-04-19 10:25:47.424303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.354 #34 NEW cov: 11947 ft: 14409 corp: 25/1974b lim: 100 exec/s: 34 rss: 71Mb L: 71/99 MS: 1 ChangeByte- 00:07:25.354 [2024-04-19 10:25:47.464428] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.354 [2024-04-19 10:25:47.464453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.354 [2024-04-19 10:25:47.464503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.354 [2024-04-19 10:25:47.464518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.354 [2024-04-19 10:25:47.464575] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.614 [2024-04-19 10:25:47.464590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.614 [2024-04-19 10:25:47.464641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.614 [2024-04-19 
10:25:47.464655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.614 #35 NEW cov: 11947 ft: 14420 corp: 26/2060b lim: 100 exec/s: 35 rss: 71Mb L: 86/99 MS: 1 ChangeBinInt- 00:07:25.614 [2024-04-19 10:25:47.514516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.614 [2024-04-19 10:25:47.514540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.614 [2024-04-19 10:25:47.514592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.614 [2024-04-19 10:25:47.514604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.614 [2024-04-19 10:25:47.514656] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.614 [2024-04-19 10:25:47.514669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.614 [2024-04-19 10:25:47.514718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.614 [2024-04-19 10:25:47.514732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.614 #41 NEW cov: 11947 ft: 14443 corp: 27/2159b lim: 100 exec/s: 41 rss: 72Mb L: 99/99 MS: 1 CMP- DE: "\377\036"- 00:07:25.614 [2024-04-19 10:25:47.554653] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.614 [2024-04-19 10:25:47.554678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.614 [2024-04-19 10:25:47.554756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.614 [2024-04-19 10:25:47.554787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.614 [2024-04-19 10:25:47.554843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.614 [2024-04-19 10:25:47.554856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.614 [2024-04-19 10:25:47.554909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.614 [2024-04-19 10:25:47.554922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.614 #42 NEW cov: 11947 ft: 14448 corp: 28/2248b lim: 100 exec/s: 42 rss: 72Mb L: 89/99 MS: 1 EraseBytes- 00:07:25.614 [2024-04-19 10:25:47.594635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.614 [2024-04-19 10:25:47.594659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.614 [2024-04-19 10:25:47.594695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.614 [2024-04-19 10:25:47.594709] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.614 [2024-04-19 10:25:47.594761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.614 [2024-04-19 10:25:47.594774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.614 #43 NEW cov: 11947 ft: 14459 corp: 29/2326b lim: 100 exec/s: 43 rss: 72Mb L: 78/99 MS: 1 EraseBytes- 00:07:25.614 [2024-04-19 10:25:47.634825] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.614 [2024-04-19 10:25:47.634849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.614 [2024-04-19 10:25:47.634905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.614 [2024-04-19 10:25:47.634919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.614 [2024-04-19 10:25:47.634970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.614 [2024-04-19 10:25:47.634983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.614 [2024-04-19 10:25:47.635034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.614 [2024-04-19 10:25:47.635047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.614 #44 NEW cov: 11947 ft: 14481 corp: 30/2421b lim: 100 exec/s: 44 rss: 72Mb L: 95/99 MS: 1 CopyPart- 00:07:25.614 [2024-04-19 10:25:47.674972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.614 [2024-04-19 10:25:47.674996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.614 [2024-04-19 10:25:47.675050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.614 [2024-04-19 10:25:47.675064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.614 [2024-04-19 10:25:47.675116] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.614 [2024-04-19 10:25:47.675130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.614 [2024-04-19 10:25:47.675181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.614 [2024-04-19 10:25:47.675194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.614 #45 NEW cov: 11947 ft: 14485 corp: 31/2520b lim: 100 exec/s: 45 rss: 72Mb L: 99/99 MS: 1 ChangeBit- 00:07:25.614 [2024-04-19 10:25:47.715097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.614 [2024-04-19 10:25:47.715121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.614 [2024-04-19 10:25:47.715171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.614 [2024-04-19 10:25:47.715185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.614 [2024-04-19 10:25:47.715234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.614 [2024-04-19 10:25:47.715248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.614 [2024-04-19 10:25:47.715299] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.614 [2024-04-19 10:25:47.715313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.874 #46 NEW cov: 11947 ft: 14490 corp: 32/2619b lim: 100 exec/s: 46 rss: 72Mb L: 99/99 MS: 1 InsertRepeatedBytes- 00:07:25.874 [2024-04-19 10:25:47.755054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.874 [2024-04-19 10:25:47.755079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.874 [2024-04-19 10:25:47.755130] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.874 [2024-04-19 10:25:47.755145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.874 [2024-04-19 10:25:47.755198] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.874 [2024-04-19 10:25:47.755212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.874 #47 NEW cov: 11947 ft: 14495 corp: 33/2690b lim: 100 exec/s: 47 rss: 72Mb L: 71/99 MS: 1 ChangeBit- 00:07:25.874 [2024-04-19 10:25:47.795159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.874 [2024-04-19 10:25:47.795183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.874 [2024-04-19 10:25:47.795232] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.874 [2024-04-19 10:25:47.795247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.874 [2024-04-19 10:25:47.795299] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.874 [2024-04-19 10:25:47.795312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.874 #48 NEW cov: 11947 ft: 14507 corp: 34/2761b lim: 100 exec/s: 48 rss: 72Mb L: 71/99 MS: 1 ChangeBit- 00:07:25.874 [2024-04-19 10:25:47.835430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.874 [2024-04-19 10:25:47.835455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.874 
[2024-04-19 10:25:47.835520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.874 [2024-04-19 10:25:47.835534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.874 [2024-04-19 10:25:47.835585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.874 [2024-04-19 10:25:47.835599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.874 [2024-04-19 10:25:47.835652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.874 [2024-04-19 10:25:47.835665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.874 #49 NEW cov: 11947 ft: 14514 corp: 35/2855b lim: 100 exec/s: 49 rss: 72Mb L: 94/99 MS: 1 CopyPart- 00:07:25.874 [2024-04-19 10:25:47.875379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.874 [2024-04-19 10:25:47.875404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.874 [2024-04-19 10:25:47.875451] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.874 [2024-04-19 10:25:47.875465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.874 [2024-04-19 10:25:47.875517] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.874 [2024-04-19 10:25:47.875530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.874 #50 NEW cov: 11947 ft: 14540 corp: 36/2925b lim: 100 exec/s: 50 rss: 72Mb L: 70/99 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:25.874 [2024-04-19 10:25:47.915657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.874 [2024-04-19 10:25:47.915681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.874 [2024-04-19 10:25:47.915724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.874 [2024-04-19 10:25:47.915739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.874 [2024-04-19 10:25:47.915790] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.874 [2024-04-19 10:25:47.915803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.874 [2024-04-19 10:25:47.915861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.874 [2024-04-19 10:25:47.915876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.874 #51 NEW cov: 11947 ft: 14555 corp: 37/3019b lim: 100 exec/s: 51 rss: 73Mb L: 94/99 MS: 1 PersAutoDict- DE: "\377\036"- 00:07:25.874 [2024-04-19 10:25:47.955902] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.874 [2024-04-19 10:25:47.955926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.874 [2024-04-19 10:25:47.955981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.874 [2024-04-19 10:25:47.955995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.874 [2024-04-19 10:25:47.956046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.874 [2024-04-19 10:25:47.956060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.874 [2024-04-19 10:25:47.956113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.874 [2024-04-19 10:25:47.956127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.874 [2024-04-19 10:25:47.956176] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:07:25.874 [2024-04-19 10:25:47.956190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:25.874 #52 NEW cov: 11947 ft: 14620 corp: 38/3119b lim: 100 exec/s: 52 rss: 73Mb L: 100/100 MS: 1 InsertRepeatedBytes- 00:07:26.135 [2024-04-19 10:25:47.995877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.135 [2024-04-19 10:25:47.995902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.135 [2024-04-19 10:25:47.995971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.135 [2024-04-19 10:25:47.995986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.135 [2024-04-19 10:25:47.996037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.135 [2024-04-19 10:25:47.996051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.135 [2024-04-19 10:25:47.996104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.135 [2024-04-19 10:25:47.996118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.135 #53 NEW cov: 11947 ft: 14670 corp: 39/3213b lim: 100 exec/s: 53 rss: 73Mb L: 94/100 MS: 1 CopyPart- 00:07:26.135 [2024-04-19 10:25:48.036024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.135 [2024-04-19 10:25:48.036051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.135 [2024-04-19 10:25:48.036104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.135 [2024-04-19 10:25:48.036117] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.135 [2024-04-19 10:25:48.036168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.135 [2024-04-19 10:25:48.036182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.135 [2024-04-19 10:25:48.036234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.135 [2024-04-19 10:25:48.036248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.135 #54 NEW cov: 11947 ft: 14727 corp: 40/3293b lim: 100 exec/s: 54 rss: 73Mb L: 80/100 MS: 1 PersAutoDict- DE: "\377\036"- 00:07:26.135 [2024-04-19 10:25:48.076145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.135 [2024-04-19 10:25:48.076172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.135 [2024-04-19 10:25:48.076209] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.135 [2024-04-19 10:25:48.076223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.135 [2024-04-19 10:25:48.076274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.135 [2024-04-19 10:25:48.076287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.135 [2024-04-19 10:25:48.076337] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.135 [2024-04-19 10:25:48.076352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.135 #60 NEW cov: 11947 ft: 14730 corp: 41/3380b lim: 100 exec/s: 60 rss: 73Mb L: 87/100 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:26.135 [2024-04-19 10:25:48.116207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.135 [2024-04-19 10:25:48.116234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.135 [2024-04-19 10:25:48.116282] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.136 [2024-04-19 10:25:48.116296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.136 [2024-04-19 10:25:48.116364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.136 [2024-04-19 10:25:48.116378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.136 [2024-04-19 10:25:48.116429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.136 [2024-04-19 10:25:48.116442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.136 #61 
NEW cov: 11947 ft: 14734 corp: 42/3460b lim: 100 exec/s: 61 rss: 73Mb L: 80/100 MS: 1 ChangeBinInt- 00:07:26.136 [2024-04-19 10:25:48.156225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.136 [2024-04-19 10:25:48.156249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.136 [2024-04-19 10:25:48.156305] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.136 [2024-04-19 10:25:48.156319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.136 [2024-04-19 10:25:48.156385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.136 [2024-04-19 10:25:48.156399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.136 [2024-04-19 10:25:48.196193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.136 [2024-04-19 10:25:48.196218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.136 [2024-04-19 10:25:48.196254] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.136 [2024-04-19 10:25:48.196268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.136 #63 NEW cov: 11947 ft: 14976 corp: 43/3518b lim: 100 exec/s: 31 rss: 73Mb L: 58/100 MS: 2 ChangeBinInt-CrossOver- 00:07:26.136 #63 DONE cov: 11947 ft: 14976 corp: 43/3518b lim: 100 exec/s: 31 rss: 73Mb 00:07:26.136 ###### Recommended dictionary. ###### 00:07:26.136 "\000\031\371\034\232\331\004\\" # Uses: 3 00:07:26.136 "\377\007" # Uses: 2 00:07:26.136 "\377\036" # Uses: 2 00:07:26.136 "\000\000\000\000" # Uses: 1 00:07:26.136 ###### End of recommended dictionary. 
###### 00:07:26.136 Done 63 runs in 2 second(s) 00:07:26.395 10:25:48 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_18.conf /var/tmp/suppress_nvmf_fuzz 00:07:26.395 10:25:48 -- ../common.sh@72 -- # (( i++ )) 00:07:26.395 10:25:48 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:26.395 10:25:48 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:07:26.395 10:25:48 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:07:26.395 10:25:48 -- nvmf/run.sh@24 -- # local timen=1 00:07:26.395 10:25:48 -- nvmf/run.sh@25 -- # local core=0x1 00:07:26.395 10:25:48 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:26.395 10:25:48 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:07:26.395 10:25:48 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:26.395 10:25:48 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:26.395 10:25:48 -- nvmf/run.sh@34 -- # printf %02d 19 00:07:26.395 10:25:48 -- nvmf/run.sh@34 -- # port=4419 00:07:26.395 10:25:48 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:26.395 10:25:48 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:07:26.395 10:25:48 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:26.395 10:25:48 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:26.395 10:25:48 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:26.395 10:25:48 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 00:07:26.395 [2024-04-19 10:25:48.388095] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:07:26.395 [2024-04-19 10:25:48.388177] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid206842 ] 00:07:26.395 EAL: No free 2048 kB hugepages reported on node 1 00:07:26.654 [2024-04-19 10:25:48.643545] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.654 [2024-04-19 10:25:48.726989] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.913 [2024-04-19 10:25:48.785959] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:26.913 [2024-04-19 10:25:48.802098] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:07:26.913 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:26.913 INFO: Seed: 42948143 00:07:26.913 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:07:26.913 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:07:26.913 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:26.913 INFO: A corpus is not provided, starting from an empty corpus 00:07:26.913 #2 INITED exec/s: 0 rss: 63Mb 00:07:26.913 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:26.913 This may also happen if the target rejected all inputs we tried so far 00:07:26.913 [2024-04-19 10:25:48.846960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723600413583998975 len:48831 00:07:26.913 [2024-04-19 10:25:48.846995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.913 [2024-04-19 10:25:48.847045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744632839234567870 len:48831 00:07:26.913 [2024-04-19 10:25:48.847063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.913 [2024-04-19 10:25:48.847093] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:26.913 [2024-04-19 10:25:48.847109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.913 [2024-04-19 10:25:48.847137] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48831 00:07:26.913 [2024-04-19 10:25:48.847153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:27.172 NEW_FUNC[1/670]: 0x4a23f0 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:07:27.172 NEW_FUNC[2/670]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:27.172 #57 NEW cov: 11681 ft: 11682 corp: 2/49b lim: 50 exec/s: 0 rss: 69Mb L: 48/48 MS: 5 CrossOver-CopyPart-CMP-CrossOver-InsertRepeatedBytes- DE: "\377\377\377~"- 00:07:27.172 [2024-04-19 10:25:49.187785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13694038808650776575 len:48831 00:07:27.172 [2024-04-19 10:25:49.187835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.172 [2024-04-19 10:25:49.187886] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744632839234567870 len:48831 00:07:27.172 [2024-04-19 10:25:49.187905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.172 [2024-04-19 10:25:49.187933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:27.172 [2024-04-19 10:25:49.187949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.172 [2024-04-19 10:25:49.187977] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48831 00:07:27.172 [2024-04-19 10:25:49.187993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:27.172 #58 NEW cov: 11811 ft: 12152 corp: 3/97b lim: 50 exec/s: 0 rss: 69Mb L: 48/48 MS: 1 ShuffleBytes- 00:07:27.173 [2024-04-19 10:25:49.257846] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723600413583998975 len:48831 00:07:27.173 [2024-04-19 10:25:49.257879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.173 [2024-04-19 10:25:49.257930] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744632839234567870 len:48831 00:07:27.173 [2024-04-19 10:25:49.257948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.173 [2024-04-19 10:25:49.257977] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:27.173 [2024-04-19 10:25:49.257993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.173 [2024-04-19 10:25:49.258021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48831 00:07:27.173 [2024-04-19 10:25:49.258037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:27.432 #59 NEW cov: 11817 ft: 12549 corp: 4/145b lim: 50 exec/s: 0 rss: 69Mb L: 48/48 MS: 1 ChangeBinInt- 00:07:27.432 [2024-04-19 10:25:49.307994] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13702213956776165375 len:48831 00:07:27.432 [2024-04-19 10:25:49.308026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.432 [2024-04-19 10:25:49.308059] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744632839234567870 len:48831 00:07:27.433 [2024-04-19 10:25:49.308076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.433 [2024-04-19 10:25:49.308106] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:27.433 [2024-04-19 10:25:49.308122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.433 [2024-04-19 10:25:49.308150] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48831 00:07:27.433 [2024-04-19 10:25:49.308166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:27.433 #60 NEW cov: 11902 ft: 12765 corp: 5/194b lim: 50 exec/s: 0 rss: 69Mb L: 49/49 MS: 1 InsertByte- 00:07:27.433 
[2024-04-19 10:25:49.378161] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723600413583998975 len:48831 00:07:27.433 [2024-04-19 10:25:49.378190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.433 [2024-04-19 10:25:49.378237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744632839100350142 len:48831 00:07:27.433 [2024-04-19 10:25:49.378255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.433 [2024-04-19 10:25:49.378285] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:27.433 [2024-04-19 10:25:49.378301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.433 [2024-04-19 10:25:49.378330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48831 00:07:27.433 [2024-04-19 10:25:49.378346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:27.433 #61 NEW cov: 11902 ft: 12871 corp: 6/242b lim: 50 exec/s: 0 rss: 69Mb L: 48/49 MS: 1 ChangeBit- 00:07:27.433 [2024-04-19 10:25:49.428255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723600413583998975 len:48831 00:07:27.433 [2024-04-19 10:25:49.428284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.433 [2024-04-19 10:25:49.428335] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744632839100350142 len:48831 00:07:27.433 [2024-04-19 10:25:49.428352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.433 [2024-04-19 10:25:49.428382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:27.433 [2024-04-19 10:25:49.428398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.433 [2024-04-19 10:25:49.428426] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48831 00:07:27.433 [2024-04-19 10:25:49.428441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:27.433 #62 NEW cov: 11902 ft: 12967 corp: 7/290b lim: 50 exec/s: 0 rss: 70Mb L: 48/49 MS: 1 ShuffleBytes- 00:07:27.433 [2024-04-19 10:25:49.498497] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446473636983324350 len:48831 00:07:27.433 [2024-04-19 10:25:49.498527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.433 [2024-04-19 10:25:49.498574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744632839234567870 len:48831 00:07:27.433 
[2024-04-19 10:25:49.498591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.433 [2024-04-19 10:25:49.498622] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:27.433 [2024-04-19 10:25:49.498638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.433 [2024-04-19 10:25:49.498665] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48831 00:07:27.433 [2024-04-19 10:25:49.498682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:27.433 [2024-04-19 10:25:49.498710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:13744632839234567870 len:48831 00:07:27.433 [2024-04-19 10:25:49.498726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:27.433 #63 NEW cov: 11902 ft: 13075 corp: 8/340b lim: 50 exec/s: 0 rss: 70Mb L: 50/50 MS: 1 CrossOver- 00:07:27.693 [2024-04-19 10:25:49.548615] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723600413583945215 len:48831 00:07:27.693 [2024-04-19 10:25:49.548646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.693 [2024-04-19 10:25:49.548680] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744632839100350142 len:48831 00:07:27.693 [2024-04-19 10:25:49.548698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.693 [2024-04-19 10:25:49.548729] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:27.693 [2024-04-19 10:25:49.548746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.693 [2024-04-19 10:25:49.548775] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48831 00:07:27.693 [2024-04-19 10:25:49.548792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:27.693 #64 NEW cov: 11902 ft: 13088 corp: 9/388b lim: 50 exec/s: 0 rss: 70Mb L: 48/50 MS: 1 ChangeByte- 00:07:27.693 [2024-04-19 10:25:49.598696] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446473636983324350 len:48651 00:07:27.693 [2024-04-19 10:25:49.598728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.693 #65 NEW cov: 11902 ft: 13519 corp: 10/400b lim: 50 exec/s: 0 rss: 70Mb L: 12/50 MS: 1 CrossOver- 00:07:27.693 [2024-04-19 10:25:49.678992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723672161012678655 len:65407 00:07:27.693 [2024-04-19 10:25:49.679025] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.693 [2024-04-19 10:25:49.679058] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744632839100350142 len:48831 00:07:27.693 [2024-04-19 10:25:49.679076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.693 [2024-04-19 10:25:49.679107] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:27.693 [2024-04-19 10:25:49.679123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.693 [2024-04-19 10:25:49.679153] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48831 00:07:27.693 [2024-04-19 10:25:49.679169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:27.693 #66 NEW cov: 11902 ft: 13561 corp: 11/448b lim: 50 exec/s: 0 rss: 70Mb L: 48/50 MS: 1 PersAutoDict- DE: "\377\377\377~"- 00:07:27.693 [2024-04-19 10:25:49.729043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723672161012678655 len:65407 00:07:27.693 [2024-04-19 10:25:49.729074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.693 [2024-04-19 10:25:49.729121] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744704586529029822 len:65407 00:07:27.693 [2024-04-19 10:25:49.729138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.693 [2024-04-19 10:25:49.729168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:27.693 [2024-04-19 10:25:49.729185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.693 [2024-04-19 10:25:49.729214] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48831 00:07:27.693 [2024-04-19 10:25:49.729230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:27.693 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:27.693 #67 NEW cov: 11925 ft: 13620 corp: 12/496b lim: 50 exec/s: 0 rss: 70Mb L: 48/50 MS: 1 PersAutoDict- DE: "\377\377\377~"- 00:07:27.693 [2024-04-19 10:25:49.799288] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723600413583945215 len:48831 00:07:27.693 [2024-04-19 10:25:49.799320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.693 [2024-04-19 10:25:49.799354] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744632839100350142 len:48831 00:07:27.693 [2024-04-19 10:25:49.799377] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.693 [2024-04-19 10:25:49.799408] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:27.693 [2024-04-19 10:25:49.799424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.693 [2024-04-19 10:25:49.799454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:360287973389811390 len:1 00:07:27.693 [2024-04-19 10:25:49.799471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:27.951 #68 NEW cov: 11925 ft: 13658 corp: 13/544b lim: 50 exec/s: 68 rss: 70Mb L: 48/50 MS: 1 CMP- DE: "\005\000\000\000\000\000\000\000"- 00:07:27.951 [2024-04-19 10:25:49.869306] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446473636983324350 len:48651 00:07:27.952 [2024-04-19 10:25:49.869336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.952 #69 NEW cov: 11925 ft: 13701 corp: 14/556b lim: 50 exec/s: 69 rss: 70Mb L: 12/50 MS: 1 ChangeBinInt- 00:07:27.952 [2024-04-19 10:25:49.939570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723599881008054271 len:15807 00:07:27.952 [2024-04-19 10:25:49.939600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.952 [2024-04-19 10:25:49.939646] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744632839234567870 len:48831 00:07:27.952 [2024-04-19 10:25:49.939663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.952 [2024-04-19 10:25:49.939693] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:27.952 [2024-04-19 10:25:49.939709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.952 [2024-04-19 10:25:49.939737] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48831 00:07:27.952 [2024-04-19 10:25:49.939753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:27.952 #70 NEW cov: 11925 ft: 13753 corp: 15/604b lim: 50 exec/s: 70 rss: 70Mb L: 48/50 MS: 1 ChangeBinInt- 00:07:27.952 [2024-04-19 10:25:49.999749] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723600413583945215 len:48831 00:07:27.952 [2024-04-19 10:25:49.999778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.952 [2024-04-19 10:25:49.999830] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744632839100350142 len:48831 00:07:27.952 [2024-04-19 10:25:49.999848] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.952 [2024-04-19 10:25:49.999878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:27.952 [2024-04-19 10:25:49.999894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.952 [2024-04-19 10:25:49.999922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48704 00:07:27.952 [2024-04-19 10:25:49.999938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:27.952 #71 NEW cov: 11925 ft: 13758 corp: 16/652b lim: 50 exec/s: 71 rss: 70Mb L: 48/50 MS: 1 ChangeByte- 00:07:27.952 [2024-04-19 10:25:50.060057] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723672161012678655 len:65407 00:07:27.952 [2024-04-19 10:25:50.060095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.952 [2024-04-19 10:25:50.060133] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744704586529029822 len:65407 00:07:27.952 [2024-04-19 10:25:50.060154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.952 [2024-04-19 10:25:50.060189] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:27.952 [2024-04-19 10:25:50.060208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.952 [2024-04-19 10:25:50.060241] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48831 00:07:27.952 [2024-04-19 10:25:50.060260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.210 #72 NEW cov: 11925 ft: 13795 corp: 17/696b lim: 50 exec/s: 72 rss: 70Mb L: 44/50 MS: 1 EraseBytes- 00:07:28.210 [2024-04-19 10:25:50.130080] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723600413583945215 len:48831 00:07:28.210 [2024-04-19 10:25:50.130111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.210 [2024-04-19 10:25:50.130158] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744632839100350142 len:48831 00:07:28.210 [2024-04-19 10:25:50.130176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.210 [2024-04-19 10:25:50.130206] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:28.210 [2024-04-19 10:25:50.130222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.210 [2024-04-19 10:25:50.130250] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:360471591831649982 len:1 00:07:28.210 [2024-04-19 10:25:50.130267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.210 #73 NEW cov: 11925 ft: 13822 corp: 18/744b lim: 50 exec/s: 73 rss: 70Mb L: 48/50 MS: 1 ChangeByte- 00:07:28.210 [2024-04-19 10:25:50.190215] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723600413583945215 len:48831 00:07:28.210 [2024-04-19 10:25:50.190246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.210 [2024-04-19 10:25:50.190293] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744648232263139006 len:48831 00:07:28.210 [2024-04-19 10:25:50.190310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.210 [2024-04-19 10:25:50.190340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:28.210 [2024-04-19 10:25:50.190356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.210 [2024-04-19 10:25:50.190385] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48831 00:07:28.210 [2024-04-19 10:25:50.190401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.210 #74 NEW cov: 11925 ft: 13842 corp: 19/792b lim: 50 exec/s: 74 rss: 70Mb L: 48/50 MS: 1 ChangeByte- 00:07:28.210 [2024-04-19 10:25:50.240349] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723600413583998975 len:48831 00:07:28.210 [2024-04-19 10:25:50.240377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.210 [2024-04-19 10:25:50.240424] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744632839100350142 len:48831 00:07:28.210 [2024-04-19 10:25:50.240442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.210 [2024-04-19 10:25:50.240472] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:28.210 [2024-04-19 10:25:50.240488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.210 [2024-04-19 10:25:50.240516] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48831 00:07:28.210 [2024-04-19 10:25:50.240532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.210 #75 NEW cov: 11925 ft: 13876 corp: 20/840b lim: 50 exec/s: 75 rss: 70Mb L: 48/50 MS: 1 ShuffleBytes- 00:07:28.210 [2024-04-19 10:25:50.290476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE 
sqid:1 cid:0 nsid:0 lba:723672161012678655 len:65407 00:07:28.210 [2024-04-19 10:25:50.290504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.210 [2024-04-19 10:25:50.290551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744704586529029822 len:65407 00:07:28.210 [2024-04-19 10:25:50.290568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.210 [2024-04-19 10:25:50.290597] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:28.210 [2024-04-19 10:25:50.290613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.210 [2024-04-19 10:25:50.290642] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48831 00:07:28.210 [2024-04-19 10:25:50.290658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.469 #76 NEW cov: 11925 ft: 13900 corp: 21/888b lim: 50 exec/s: 76 rss: 70Mb L: 48/50 MS: 1 ShuffleBytes- 00:07:28.469 [2024-04-19 10:25:50.340477] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167837696 len:11 00:07:28.469 [2024-04-19 10:25:50.340505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.469 #79 NEW cov: 11925 ft: 13906 corp: 22/898b lim: 50 exec/s: 79 rss: 70Mb L: 10/50 MS: 3 CrossOver-EraseBytes-CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:28.469 [2024-04-19 10:25:50.410825] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13694038035572719550 len:48831 00:07:28.469 [2024-04-19 10:25:50.410855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.469 [2024-04-19 10:25:50.410901] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744632839234567870 len:48831 00:07:28.469 [2024-04-19 10:25:50.410927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.469 [2024-04-19 10:25:50.410960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:28.469 [2024-04-19 10:25:50.410977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.469 [2024-04-19 10:25:50.411005] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48831 00:07:28.469 [2024-04-19 10:25:50.411021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.469 #80 NEW cov: 11925 ft: 13932 corp: 23/946b lim: 50 exec/s: 80 rss: 70Mb L: 48/50 MS: 1 ShuffleBytes- 00:07:28.469 [2024-04-19 10:25:50.460838] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 
nsid:0 lba:18446473636983324350 len:3083 00:07:28.469 [2024-04-19 10:25:50.460867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.469 #81 NEW cov: 11925 ft: 13946 corp: 24/958b lim: 50 exec/s: 81 rss: 70Mb L: 12/50 MS: 1 ChangeBinInt- 00:07:28.469 [2024-04-19 10:25:50.510955] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:184532481 len:1 00:07:28.469 [2024-04-19 10:25:50.510984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.469 #82 NEW cov: 11925 ft: 13973 corp: 25/970b lim: 50 exec/s: 82 rss: 70Mb L: 12/50 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:28.469 [2024-04-19 10:25:50.561154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723600413583998975 len:48831 00:07:28.469 [2024-04-19 10:25:50.561182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.469 [2024-04-19 10:25:50.561229] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744632839234567870 len:48831 00:07:28.469 [2024-04-19 10:25:50.561247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.469 [2024-04-19 10:25:50.561277] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:28.469 [2024-04-19 10:25:50.561293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.728 #88 NEW cov: 11925 ft: 14238 corp: 26/1000b lim: 50 exec/s: 88 rss: 70Mb L: 30/50 MS: 1 EraseBytes- 00:07:28.728 [2024-04-19 10:25:50.611305] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723600413583945215 len:48831 00:07:28.728 [2024-04-19 10:25:50.611334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.728 [2024-04-19 10:25:50.611381] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744648230014992062 len:48831 00:07:28.728 [2024-04-19 10:25:50.611398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.728 [2024-04-19 10:25:50.611428] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:28.728 [2024-04-19 10:25:50.611444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.728 [2024-04-19 10:25:50.611473] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48831 00:07:28.728 [2024-04-19 10:25:50.611489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.728 #89 NEW cov: 11925 ft: 14241 corp: 27/1048b lim: 50 exec/s: 89 rss: 71Mb L: 48/50 MS: 1 ChangeBinInt- 00:07:28.728 [2024-04-19 10:25:50.671368] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:28.728 [2024-04-19 10:25:50.671396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.728 #92 NEW cov: 11925 ft: 14255 corp: 28/1060b lim: 50 exec/s: 92 rss: 71Mb L: 12/50 MS: 3 CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:07:28.728 [2024-04-19 10:25:50.731612] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723672161012678655 len:65407 00:07:28.728 [2024-04-19 10:25:50.731642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.728 [2024-04-19 10:25:50.731690] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744704586529029822 len:65407 00:07:28.728 [2024-04-19 10:25:50.731707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.728 [2024-04-19 10:25:50.731737] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:28.728 [2024-04-19 10:25:50.731753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.728 [2024-04-19 10:25:50.731781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48831 00:07:28.728 [2024-04-19 10:25:50.731798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.728 #93 NEW cov: 11925 ft: 14264 corp: 29/1108b lim: 50 exec/s: 93 rss: 71Mb L: 48/50 MS: 1 ShuffleBytes- 00:07:28.728 [2024-04-19 10:25:50.781750] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723600413583945215 len:48831 00:07:28.728 [2024-04-19 10:25:50.781779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.728 [2024-04-19 10:25:50.781832] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744632839100350142 len:48831 00:07:28.728 [2024-04-19 10:25:50.781849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.728 [2024-04-19 10:25:50.781879] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:28.728 [2024-04-19 10:25:50.781895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.728 [2024-04-19 10:25:50.781923] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:413977942217899710 len:48831 00:07:28.728 [2024-04-19 10:25:50.781939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.728 #94 NEW cov: 11925 ft: 14283 corp: 30/1148b lim: 50 exec/s: 94 rss: 71Mb L: 40/50 MS: 1 EraseBytes- 00:07:28.987 [2024-04-19 10:25:50.841982] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:774265136297803775 len:48831 00:07:28.987 [2024-04-19 10:25:50.842012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.987 [2024-04-19 10:25:50.842044] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13744632839100350142 len:48831 00:07:28.987 [2024-04-19 10:25:50.842071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.987 [2024-04-19 10:25:50.842102] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13744632839234567870 len:48831 00:07:28.987 [2024-04-19 10:25:50.842122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.987 [2024-04-19 10:25:50.842151] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13744632839234567870 len:48831 00:07:28.987 [2024-04-19 10:25:50.842168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.987 #95 NEW cov: 11925 ft: 14291 corp: 31/1196b lim: 50 exec/s: 47 rss: 71Mb L: 48/50 MS: 1 ShuffleBytes- 00:07:28.987 #95 DONE cov: 11925 ft: 14291 corp: 31/1196b lim: 50 exec/s: 47 rss: 71Mb 00:07:28.987 ###### Recommended dictionary. ###### 00:07:28.987 "\377\377\377~" # Uses: 2 00:07:28.987 "\005\000\000\000\000\000\000\000" # Uses: 0 00:07:28.987 "\001\000\000\000\000\000\000\000" # Uses: 1 00:07:28.987 ###### End of recommended dictionary. 
###### 00:07:28.987 Done 95 runs in 2 second(s) 00:07:28.987 10:25:50 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_19.conf /var/tmp/suppress_nvmf_fuzz 00:07:28.987 10:25:50 -- ../common.sh@72 -- # (( i++ )) 00:07:28.987 10:25:50 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:28.987 10:25:50 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:07:28.987 10:25:50 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:07:28.987 10:25:50 -- nvmf/run.sh@24 -- # local timen=1 00:07:28.987 10:25:50 -- nvmf/run.sh@25 -- # local core=0x1 00:07:28.987 10:25:50 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:28.987 10:25:50 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:07:28.987 10:25:50 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:28.987 10:25:50 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:28.987 10:25:51 -- nvmf/run.sh@34 -- # printf %02d 20 00:07:28.987 10:25:51 -- nvmf/run.sh@34 -- # port=4420 00:07:28.987 10:25:51 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:28.987 10:25:51 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:07:28.987 10:25:51 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:28.987 10:25:51 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:28.987 10:25:51 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:28.987 10:25:51 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 00:07:28.987 [2024-04-19 10:25:51.041992] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:07:28.987 [2024-04-19 10:25:51.042074] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid207204 ] 00:07:28.987 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.245 [2024-04-19 10:25:51.298488] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.503 [2024-04-19 10:25:51.381502] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.503 [2024-04-19 10:25:51.440538] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:29.503 [2024-04-19 10:25:51.456677] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:07:29.503 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:29.503 INFO: Seed: 2696950816 00:07:29.503 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:07:29.503 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:07:29.503 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:29.503 INFO: A corpus is not provided, starting from an empty corpus 00:07:29.503 #2 INITED exec/s: 0 rss: 63Mb 00:07:29.503 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:29.503 This may also happen if the target rejected all inputs we tried so far 00:07:29.503 [2024-04-19 10:25:51.511877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:29.503 [2024-04-19 10:25:51.511906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.762 NEW_FUNC[1/672]: 0x4a3fb0 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:07:29.762 NEW_FUNC[2/672]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:29.762 #3 NEW cov: 11739 ft: 11739 corp: 2/20b lim: 90 exec/s: 0 rss: 69Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:29.762 [2024-04-19 10:25:51.842976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:29.762 [2024-04-19 10:25:51.843025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.762 [2024-04-19 10:25:51.843093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:29.762 [2024-04-19 10:25:51.843115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.020 #4 NEW cov: 11869 ft: 12935 corp: 3/58b lim: 90 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 CrossOver- 00:07:30.020 [2024-04-19 10:25:51.892883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.020 [2024-04-19 10:25:51.892911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.020 #5 NEW cov: 11875 ft: 13130 corp: 4/80b lim: 90 exec/s: 0 rss: 69Mb L: 22/38 MS: 1 EraseBytes- 00:07:30.020 [2024-04-19 10:25:51.932934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.021 [2024-04-19 10:25:51.932960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.021 #6 NEW cov: 11960 ft: 13503 corp: 5/104b lim: 90 exec/s: 0 rss: 69Mb L: 24/38 MS: 1 CopyPart- 00:07:30.021 [2024-04-19 10:25:51.973213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.021 [2024-04-19 10:25:51.973239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.021 [2024-04-19 10:25:51.973291] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.021 [2024-04-19 10:25:51.973307] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.021 #7 NEW cov: 11960 ft: 13615 corp: 6/145b lim: 90 exec/s: 0 rss: 69Mb L: 41/41 MS: 1 CrossOver- 00:07:30.021 [2024-04-19 10:25:52.023338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.021 [2024-04-19 10:25:52.023365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.021 [2024-04-19 10:25:52.023402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.021 [2024-04-19 10:25:52.023419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.021 #8 NEW cov: 11960 ft: 13733 corp: 7/198b lim: 90 exec/s: 0 rss: 70Mb L: 53/53 MS: 1 InsertRepeatedBytes- 00:07:30.021 [2024-04-19 10:25:52.063448] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.021 [2024-04-19 10:25:52.063474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.021 [2024-04-19 10:25:52.063512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.021 [2024-04-19 10:25:52.063531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.021 #12 NEW cov: 11960 ft: 13774 corp: 8/235b lim: 90 exec/s: 0 rss: 70Mb L: 37/53 MS: 4 CrossOver-CopyPart-CopyPart-InsertRepeatedBytes- 00:07:30.021 [2024-04-19 10:25:52.103540] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.021 [2024-04-19 10:25:52.103566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.021 [2024-04-19 10:25:52.103618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.021 [2024-04-19 10:25:52.103633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.021 #13 NEW cov: 11960 ft: 13843 corp: 9/277b lim: 90 exec/s: 0 rss: 70Mb L: 42/53 MS: 1 InsertRepeatedBytes- 00:07:30.279 [2024-04-19 10:25:52.143527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.279 [2024-04-19 10:25:52.143555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.279 #14 NEW cov: 11960 ft: 13924 corp: 10/310b lim: 90 exec/s: 0 rss: 70Mb L: 33/53 MS: 1 EraseBytes- 00:07:30.279 [2024-04-19 10:25:52.184295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.279 [2024-04-19 10:25:52.184323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.279 [2024-04-19 10:25:52.184376] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.279 [2024-04-19 10:25:52.184392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.279 [2024-04-19 10:25:52.184449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:30.279 [2024-04-19 10:25:52.184465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.279 [2024-04-19 10:25:52.184518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:30.279 [2024-04-19 10:25:52.184534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:30.279 #15 NEW cov: 11960 ft: 14499 corp: 11/388b lim: 90 exec/s: 0 rss: 70Mb L: 78/78 MS: 1 InsertRepeatedBytes- 00:07:30.279 [2024-04-19 10:25:52.233756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.279 [2024-04-19 10:25:52.233783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.279 #21 NEW cov: 11960 ft: 14518 corp: 12/407b lim: 90 exec/s: 0 rss: 70Mb L: 19/78 MS: 1 ChangeByte- 00:07:30.279 [2024-04-19 10:25:52.274040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.279 [2024-04-19 10:25:52.274067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.279 [2024-04-19 10:25:52.274120] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.280 [2024-04-19 10:25:52.274136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.280 #22 NEW cov: 11960 ft: 14592 corp: 13/445b lim: 90 exec/s: 0 rss: 70Mb L: 38/78 MS: 1 ShuffleBytes- 00:07:30.280 [2024-04-19 10:25:52.313996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.280 [2024-04-19 10:25:52.314023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.280 #23 NEW cov: 11960 ft: 14613 corp: 14/476b lim: 90 exec/s: 0 rss: 70Mb L: 31/78 MS: 1 EraseBytes- 00:07:30.280 [2024-04-19 10:25:52.364288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.280 [2024-04-19 10:25:52.364314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.280 [2024-04-19 10:25:52.364353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.280 [2024-04-19 10:25:52.364370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.280 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:30.280 #25 NEW cov: 11983 ft: 14628 corp: 15/515b lim: 90 exec/s: 0 rss: 70Mb L: 39/78 MS: 2 CopyPart-CrossOver- 00:07:30.538 [2024-04-19 10:25:52.404362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.538 [2024-04-19 10:25:52.404388] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.538 [2024-04-19 10:25:52.404432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.538 [2024-04-19 10:25:52.404448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.538 #26 NEW cov: 11983 ft: 14701 corp: 16/568b lim: 90 exec/s: 0 rss: 70Mb L: 53/78 MS: 1 CrossOver- 00:07:30.538 [2024-04-19 10:25:52.444487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.538 [2024-04-19 10:25:52.444512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.538 [2024-04-19 10:25:52.444565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.538 [2024-04-19 10:25:52.444582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.538 #27 NEW cov: 11983 ft: 14815 corp: 17/621b lim: 90 exec/s: 0 rss: 70Mb L: 53/78 MS: 1 CopyPart- 00:07:30.538 [2024-04-19 10:25:52.494478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.538 [2024-04-19 10:25:52.494505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.538 #29 NEW cov: 11983 ft: 14829 corp: 18/650b lim: 90 exec/s: 29 rss: 70Mb L: 29/78 MS: 2 EraseBytes-CrossOver- 00:07:30.538 [2024-04-19 10:25:52.534771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.538 [2024-04-19 10:25:52.534796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.538 [2024-04-19 10:25:52.534838] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.538 [2024-04-19 10:25:52.534852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.538 #30 NEW cov: 11983 ft: 14831 corp: 19/689b lim: 90 exec/s: 30 rss: 70Mb L: 39/78 MS: 1 ChangeByte- 00:07:30.538 [2024-04-19 10:25:52.584910] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.538 [2024-04-19 10:25:52.584937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.538 [2024-04-19 10:25:52.584975] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.538 [2024-04-19 10:25:52.584991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.538 #31 NEW cov: 11983 ft: 14838 corp: 20/730b lim: 90 exec/s: 31 rss: 70Mb L: 41/78 MS: 1 CopyPart- 00:07:30.538 [2024-04-19 10:25:52.625014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.538 [2024-04-19 10:25:52.625040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:07:30.538 [2024-04-19 10:25:52.625096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.538 [2024-04-19 10:25:52.625111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.797 #32 NEW cov: 11983 ft: 14856 corp: 21/771b lim: 90 exec/s: 32 rss: 70Mb L: 41/78 MS: 1 CopyPart- 00:07:30.797 [2024-04-19 10:25:52.665413] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.797 [2024-04-19 10:25:52.665440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.797 [2024-04-19 10:25:52.665489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.797 [2024-04-19 10:25:52.665505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.797 [2024-04-19 10:25:52.665560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:30.797 [2024-04-19 10:25:52.665575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.797 [2024-04-19 10:25:52.665629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:30.797 [2024-04-19 10:25:52.665645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:30.797 #33 NEW cov: 11983 ft: 14911 corp: 22/848b lim: 90 exec/s: 33 rss: 71Mb L: 77/78 MS: 1 InsertRepeatedBytes- 00:07:30.797 [2024-04-19 10:25:52.715303] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.797 [2024-04-19 10:25:52.715328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.797 [2024-04-19 10:25:52.715368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.797 [2024-04-19 10:25:52.715384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.797 #39 NEW cov: 11983 ft: 14953 corp: 23/901b lim: 90 exec/s: 39 rss: 71Mb L: 53/78 MS: 1 CopyPart- 00:07:30.797 [2024-04-19 10:25:52.755365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.797 [2024-04-19 10:25:52.755390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.797 [2024-04-19 10:25:52.755433] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.797 [2024-04-19 10:25:52.755448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.797 #40 NEW cov: 11983 ft: 14968 corp: 24/943b lim: 90 exec/s: 40 rss: 71Mb L: 42/78 MS: 1 InsertByte- 00:07:30.797 [2024-04-19 10:25:52.805854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.797 [2024-04-19 10:25:52.805880] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.797 [2024-04-19 10:25:52.805935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.797 [2024-04-19 10:25:52.805951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.797 [2024-04-19 10:25:52.806007] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:30.797 [2024-04-19 10:25:52.806026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.797 [2024-04-19 10:25:52.806082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:30.797 [2024-04-19 10:25:52.806098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:30.797 #41 NEW cov: 11983 ft: 14986 corp: 25/1020b lim: 90 exec/s: 41 rss: 71Mb L: 77/78 MS: 1 InsertRepeatedBytes- 00:07:30.797 [2024-04-19 10:25:52.855669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.797 [2024-04-19 10:25:52.855696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.797 [2024-04-19 10:25:52.855747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.797 [2024-04-19 10:25:52.855763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.797 #42 NEW cov: 11983 ft: 15015 corp: 26/1057b lim: 90 exec/s: 42 rss: 71Mb L: 37/78 MS: 1 ShuffleBytes- 00:07:30.797 [2024-04-19 10:25:52.895636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.797 [2024-04-19 10:25:52.895663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.056 #43 NEW cov: 11983 ft: 15021 corp: 27/1088b lim: 90 exec/s: 43 rss: 71Mb L: 31/78 MS: 1 ChangeBinInt- 00:07:31.056 [2024-04-19 10:25:52.935749] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.056 [2024-04-19 10:25:52.935776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.056 #44 NEW cov: 11983 ft: 15134 corp: 28/1119b lim: 90 exec/s: 44 rss: 71Mb L: 31/78 MS: 1 ChangeBinInt- 00:07:31.056 [2024-04-19 10:25:52.976213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.056 [2024-04-19 10:25:52.976239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.056 [2024-04-19 10:25:52.976279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.056 [2024-04-19 10:25:52.976295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.056 [2024-04-19 10:25:52.976353] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.056 [2024-04-19 10:25:52.976369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.056 #45 NEW cov: 11983 ft: 15433 corp: 29/1176b lim: 90 exec/s: 45 rss: 71Mb L: 57/78 MS: 1 CopyPart- 00:07:31.056 [2024-04-19 10:25:53.016379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.056 [2024-04-19 10:25:53.016405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.056 [2024-04-19 10:25:53.016462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.056 [2024-04-19 10:25:53.016478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.056 [2024-04-19 10:25:53.016530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.056 [2024-04-19 10:25:53.016546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.056 [2024-04-19 10:25:53.016601] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:31.056 [2024-04-19 10:25:53.016619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:31.056 #46 NEW cov: 11983 ft: 15470 corp: 30/1254b lim: 90 exec/s: 46 rss: 71Mb L: 78/78 MS: 1 InsertByte- 00:07:31.056 [2024-04-19 10:25:53.066127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.056 [2024-04-19 10:25:53.066154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.056 #47 NEW cov: 11983 ft: 15476 corp: 31/1285b lim: 90 exec/s: 47 rss: 72Mb L: 31/78 MS: 1 ShuffleBytes- 00:07:31.056 [2024-04-19 10:25:53.106254] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.056 [2024-04-19 10:25:53.106281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.056 #48 NEW cov: 11983 ft: 15482 corp: 32/1304b lim: 90 exec/s: 48 rss: 72Mb L: 19/78 MS: 1 EraseBytes- 00:07:31.056 [2024-04-19 10:25:53.146305] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.056 [2024-04-19 10:25:53.146331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.314 #49 NEW cov: 11983 ft: 15546 corp: 33/1326b lim: 90 exec/s: 49 rss: 72Mb L: 22/78 MS: 1 ChangeBinInt- 00:07:31.314 [2024-04-19 10:25:53.186577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.314 [2024-04-19 10:25:53.186605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.314 [2024-04-19 10:25:53.186643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 
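[Editor's note] For readers skimming these runs: the #N NEW lines interleaved with the qpair notices are libFuzzer's status output, and their field meanings are standard libFuzzer semantics rather than anything SPDK-specific. Annotating the #49 NEW line just above:

    # #49 NEW cov: 11983 ft: 15546 corp: 33/1326b lim: 90 exec/s: 49 rss: 72Mb L: 22/78 MS: 1 ChangeBinInt-
    #   #49            input number 49 produced something new and was kept
    #   cov: 11983     distinct coverage points (edges) reached so far
    #   ft: 15546      finer-grained coverage "features" observed
    #   corp: 33/1326b corpus now holds 33 units totalling 1326 bytes
    #   lim: 90        current cap on generated input length (hence "lim: 90" throughout this run)
    #   exec/s: 49     executions per second so far
    #   rss: 72Mb      resident memory of the fuzzer process
    #   L: 22/78       this unit is 22 bytes; the longest unit in the corpus is 78
    #   MS: 1 ChangeBinInt-  the mutation sequence (one ChangeBinInt) that built it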
00:07:31.314 [2024-04-19 10:25:53.186659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.314 #50 NEW cov: 11983 ft: 15557 corp: 34/1379b lim: 90 exec/s: 50 rss: 72Mb L: 53/78 MS: 1 ShuffleBytes- 00:07:31.314 [2024-04-19 10:25:53.237058] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.314 [2024-04-19 10:25:53.237085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.314 [2024-04-19 10:25:53.237138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.314 [2024-04-19 10:25:53.237154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.314 [2024-04-19 10:25:53.237209] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.314 [2024-04-19 10:25:53.237224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.314 [2024-04-19 10:25:53.237280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:31.314 [2024-04-19 10:25:53.237295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:31.314 #51 NEW cov: 11983 ft: 15570 corp: 35/1459b lim: 90 exec/s: 51 rss: 72Mb L: 80/80 MS: 1 InsertRepeatedBytes- 00:07:31.314 [2024-04-19 10:25:53.286718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.314 [2024-04-19 10:25:53.286745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.314 #52 NEW cov: 11983 ft: 15608 corp: 36/1491b lim: 90 exec/s: 52 rss: 72Mb L: 32/80 MS: 1 EraseBytes- 00:07:31.314 [2024-04-19 10:25:53.326986] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.314 [2024-04-19 10:25:53.327013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.314 [2024-04-19 10:25:53.327077] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.314 [2024-04-19 10:25:53.327094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.314 #53 NEW cov: 11983 ft: 15642 corp: 37/1532b lim: 90 exec/s: 53 rss: 72Mb L: 41/80 MS: 1 ChangeBit- 00:07:31.314 [2024-04-19 10:25:53.367111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.314 [2024-04-19 10:25:53.367136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.314 [2024-04-19 10:25:53.367178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.314 [2024-04-19 10:25:53.367194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.314 #54 NEW cov: 
11983 ft: 15656 corp: 38/1574b lim: 90 exec/s: 54 rss: 72Mb L: 42/80 MS: 1 ShuffleBytes- 00:07:31.314 [2024-04-19 10:25:53.407515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.314 [2024-04-19 10:25:53.407540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.314 [2024-04-19 10:25:53.407587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.314 [2024-04-19 10:25:53.407602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.314 [2024-04-19 10:25:53.407657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.315 [2024-04-19 10:25:53.407673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.315 [2024-04-19 10:25:53.407743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:31.315 [2024-04-19 10:25:53.407760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:31.573 #55 NEW cov: 11983 ft: 15697 corp: 39/1653b lim: 90 exec/s: 55 rss: 72Mb L: 79/80 MS: 1 InsertByte- 00:07:31.573 [2024-04-19 10:25:53.457183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.573 [2024-04-19 10:25:53.457210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.573 #56 NEW cov: 11983 ft: 15724 corp: 40/1674b lim: 90 exec/s: 56 rss: 72Mb L: 21/80 MS: 1 EraseBytes- 00:07:31.573 [2024-04-19 10:25:53.497476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.573 [2024-04-19 10:25:53.497503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.573 [2024-04-19 10:25:53.497554] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.573 [2024-04-19 10:25:53.497570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.573 #57 NEW cov: 11983 ft: 15734 corp: 41/1712b lim: 90 exec/s: 28 rss: 72Mb L: 38/80 MS: 1 InsertByte- 00:07:31.573 #57 DONE cov: 11983 ft: 15734 corp: 41/1712b lim: 90 exec/s: 28 rss: 72Mb 00:07:31.573 Done 57 runs in 2 second(s) 00:07:31.573 10:25:53 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_20.conf /var/tmp/suppress_nvmf_fuzz 00:07:31.573 10:25:53 -- ../common.sh@72 -- # (( i++ )) 00:07:31.573 10:25:53 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:31.573 10:25:53 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:07:31.573 10:25:53 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:07:31.573 10:25:53 -- nvmf/run.sh@24 -- # local timen=1 00:07:31.573 10:25:53 -- nvmf/run.sh@25 -- # local core=0x1 00:07:31.573 10:25:53 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:31.573 10:25:53 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:07:31.573 10:25:53 -- nvmf/run.sh@28 -- # local 
suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:31.573 10:25:53 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:31.573 10:25:53 -- nvmf/run.sh@34 -- # printf %02d 21 00:07:31.573 10:25:53 -- nvmf/run.sh@34 -- # port=4421 00:07:31.573 10:25:53 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:31.573 10:25:53 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:07:31.574 10:25:53 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:31.574 10:25:53 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:31.574 10:25:53 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:31.574 10:25:53 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 00:07:31.574 [2024-04-19 10:25:53.682740] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:07:31.574 [2024-04-19 10:25:53.682819] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid207552 ] 00:07:31.832 EAL: No free 2048 kB hugepages reported on node 1 00:07:31.832 [2024-04-19 10:25:53.937603] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.091 [2024-04-19 10:25:54.020940] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.091 [2024-04-19 10:25:54.079897] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:32.091 [2024-04-19 10:25:54.096040] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:07:32.091 INFO: Running with entropic power schedule (0xFF, 100). 00:07:32.091 INFO: Seed: 1041976144 00:07:32.091 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:07:32.091 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:07:32.091 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:32.091 INFO: A corpus is not provided, starting from an empty corpus 00:07:32.091 #2 INITED exec/s: 0 rss: 63Mb 00:07:32.091 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:32.091 This may also happen if the target rejected all inputs we tried so far 00:07:32.091 [2024-04-19 10:25:54.140967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:32.091 [2024-04-19 10:25:54.141002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.091 [2024-04-19 10:25:54.141052] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:32.091 [2024-04-19 10:25:54.141070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.091 [2024-04-19 10:25:54.141101] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:32.091 [2024-04-19 10:25:54.141117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.091 [2024-04-19 10:25:54.141147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:32.091 [2024-04-19 10:25:54.141163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:32.658 NEW_FUNC[1/672]: 0x4a71d0 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:07:32.658 NEW_FUNC[2/672]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:32.658 #17 NEW cov: 11713 ft: 11715 corp: 2/47b lim: 50 exec/s: 0 rss: 69Mb L: 46/46 MS: 5 ShuffleBytes-ChangeByte-ChangeByte-ChangeByte-InsertRepeatedBytes- 00:07:32.658 [2024-04-19 10:25:54.481708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:32.658 [2024-04-19 10:25:54.481754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.658 [2024-04-19 10:25:54.481807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:32.658 [2024-04-19 10:25:54.481835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.658 [2024-04-19 10:25:54.481867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:32.658 [2024-04-19 10:25:54.481883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.658 #20 NEW cov: 11844 ft: 12419 corp: 3/85b lim: 50 exec/s: 0 rss: 69Mb L: 38/46 MS: 3 CMP-ShuffleBytes-InsertRepeatedBytes- DE: "\001\000\002\000"- 00:07:32.658 [2024-04-19 10:25:54.541731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:32.658 [2024-04-19 10:25:54.541761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.658 [2024-04-19 10:25:54.541823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:32.658 [2024-04-19 10:25:54.541841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.658 [2024-04-19 10:25:54.541873] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:32.658 [2024-04-19 10:25:54.541889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.658 #21 NEW cov: 11850 ft: 12771 corp: 4/123b lim: 50 exec/s: 0 rss: 69Mb L: 38/46 MS: 1 CrossOver- 00:07:32.658 [2024-04-19 10:25:54.611967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:32.658 [2024-04-19 10:25:54.611997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.658 [2024-04-19 10:25:54.612045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:32.658 [2024-04-19 10:25:54.612063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.658 [2024-04-19 10:25:54.612094] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:32.658 [2024-04-19 10:25:54.612110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.658 [2024-04-19 10:25:54.612140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:32.658 [2024-04-19 10:25:54.612156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:32.658 #22 NEW cov: 11935 ft: 13009 corp: 5/170b lim: 50 exec/s: 0 rss: 69Mb L: 47/47 MS: 1 InsertByte- 00:07:32.658 [2024-04-19 10:25:54.682152] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:32.658 [2024-04-19 10:25:54.682182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.658 [2024-04-19 10:25:54.682230] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:32.658 [2024-04-19 10:25:54.682247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.659 [2024-04-19 10:25:54.682282] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:32.659 [2024-04-19 10:25:54.682298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.659 [2024-04-19 10:25:54.682328] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:32.659 [2024-04-19 10:25:54.682344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:32.659 #23 NEW cov: 11935 ft: 13058 corp: 6/219b lim: 50 exec/s: 0 rss: 69Mb L: 49/49 MS: 1 CopyPart- 00:07:32.659 [2024-04-19 10:25:54.742242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:32.659 [2024-04-19 10:25:54.742271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.659 [2024-04-19 10:25:54.742320] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:32.659 [2024-04-19 10:25:54.742337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.659 [2024-04-19 10:25:54.742368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:32.659 [2024-04-19 10:25:54.742385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.917 #24 NEW cov: 11935 ft: 13166 corp: 7/258b lim: 50 exec/s: 0 rss: 70Mb L: 39/49 MS: 1 CrossOver- 00:07:32.917 [2024-04-19 10:25:54.812542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:32.917 [2024-04-19 10:25:54.812571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.917 [2024-04-19 10:25:54.812618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:32.917 [2024-04-19 10:25:54.812636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.917 [2024-04-19 10:25:54.812667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:32.917 [2024-04-19 10:25:54.812683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.917 [2024-04-19 10:25:54.812712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:32.917 [2024-04-19 10:25:54.812728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:32.917 [2024-04-19 10:25:54.812757] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:32.917 [2024-04-19 10:25:54.812773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:32.917 #25 NEW cov: 11935 ft: 13310 corp: 8/308b lim: 50 exec/s: 0 rss: 70Mb L: 50/50 MS: 1 InsertByte- 00:07:32.917 [2024-04-19 10:25:54.882609] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:32.917 [2024-04-19 10:25:54.882639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.917 [2024-04-19 10:25:54.882688] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:32.917 [2024-04-19 10:25:54.882706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.917 [2024-04-19 10:25:54.882737] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:32.917 [2024-04-19 10:25:54.882757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.917 #26 NEW cov: 11935 ft: 13389 corp: 9/346b lim: 50 
exec/s: 0 rss: 70Mb L: 38/50 MS: 1 ChangeBit- 00:07:32.917 [2024-04-19 10:25:54.932726] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:32.917 [2024-04-19 10:25:54.932755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.917 [2024-04-19 10:25:54.932803] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:32.917 [2024-04-19 10:25:54.932832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.917 [2024-04-19 10:25:54.932864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:32.917 [2024-04-19 10:25:54.932880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.917 #27 NEW cov: 11935 ft: 13418 corp: 10/385b lim: 50 exec/s: 0 rss: 70Mb L: 39/50 MS: 1 CopyPart- 00:07:32.917 [2024-04-19 10:25:55.002923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:32.917 [2024-04-19 10:25:55.002951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.917 [2024-04-19 10:25:55.003000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:32.917 [2024-04-19 10:25:55.003017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.917 [2024-04-19 10:25:55.003048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:32.917 [2024-04-19 10:25:55.003065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.176 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:33.176 #33 NEW cov: 11958 ft: 13485 corp: 11/424b lim: 50 exec/s: 0 rss: 70Mb L: 39/50 MS: 1 InsertByte- 00:07:33.176 [2024-04-19 10:25:55.053077] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.176 [2024-04-19 10:25:55.053108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.176 [2024-04-19 10:25:55.053157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.176 [2024-04-19 10:25:55.053175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.176 [2024-04-19 10:25:55.053206] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.176 [2024-04-19 10:25:55.053222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.176 [2024-04-19 10:25:55.053251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.176 [2024-04-19 10:25:55.053267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.176 #34 NEW cov: 11958 ft: 13526 corp: 12/473b lim: 50 exec/s: 0 rss: 70Mb L: 49/50 MS: 1 InsertRepeatedBytes- 00:07:33.176 [2024-04-19 10:25:55.124356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.176 [2024-04-19 10:25:55.124385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.176 [2024-04-19 10:25:55.124455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.176 [2024-04-19 10:25:55.124471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.176 [2024-04-19 10:25:55.124532] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.176 [2024-04-19 10:25:55.124546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.176 [2024-04-19 10:25:55.124601] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.176 [2024-04-19 10:25:55.124618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.176 [2024-04-19 10:25:55.124672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.176 [2024-04-19 10:25:55.124687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.176 #35 NEW cov: 11958 ft: 13703 corp: 13/523b lim: 50 exec/s: 35 rss: 70Mb L: 50/50 MS: 1 CopyPart- 00:07:33.176 [2024-04-19 10:25:55.174303] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.177 [2024-04-19 10:25:55.174329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.177 [2024-04-19 10:25:55.174400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.177 [2024-04-19 10:25:55.174416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.177 [2024-04-19 10:25:55.174475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.177 [2024-04-19 10:25:55.174491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.177 [2024-04-19 10:25:55.174548] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.177 [2024-04-19 10:25:55.174563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.177 #36 NEW cov: 11958 ft: 13735 corp: 14/572b lim: 50 exec/s: 36 rss: 70Mb L: 49/50 MS: 1 CopyPart- 00:07:33.177 [2024-04-19 10:25:55.224411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.177 [2024-04-19 10:25:55.224437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.177 [2024-04-19 10:25:55.224505] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.177 [2024-04-19 10:25:55.224522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.177 [2024-04-19 10:25:55.224579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.177 [2024-04-19 10:25:55.224595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.177 [2024-04-19 10:25:55.224652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.177 [2024-04-19 10:25:55.224668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.177 #37 NEW cov: 11958 ft: 13806 corp: 15/620b lim: 50 exec/s: 37 rss: 70Mb L: 48/50 MS: 1 InsertRepeatedBytes- 00:07:33.177 [2024-04-19 10:25:55.264710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.177 [2024-04-19 10:25:55.264737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.177 [2024-04-19 10:25:55.264788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.177 [2024-04-19 10:25:55.264807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.177 [2024-04-19 10:25:55.264884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.177 [2024-04-19 10:25:55.264900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.177 [2024-04-19 10:25:55.264957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.177 [2024-04-19 10:25:55.264973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.177 [2024-04-19 10:25:55.265030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.177 [2024-04-19 10:25:55.265048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.436 #38 NEW cov: 11958 ft: 13818 corp: 16/670b lim: 50 exec/s: 38 rss: 70Mb L: 50/50 MS: 1 CrossOver- 00:07:33.436 [2024-04-19 10:25:55.314351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.436 [2024-04-19 10:25:55.314378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.436 [2024-04-19 10:25:55.314419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.436 [2024-04-19 10:25:55.314435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.436 #39 NEW cov: 11958 ft: 14219 corp: 17/691b lim: 
50 exec/s: 39 rss: 70Mb L: 21/50 MS: 1 EraseBytes- 00:07:33.436 [2024-04-19 10:25:55.365003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.436 [2024-04-19 10:25:55.365029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.436 [2024-04-19 10:25:55.365088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.436 [2024-04-19 10:25:55.365104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.436 [2024-04-19 10:25:55.365159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.436 [2024-04-19 10:25:55.365174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.436 [2024-04-19 10:25:55.365228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.436 [2024-04-19 10:25:55.365244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.436 [2024-04-19 10:25:55.365300] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.436 [2024-04-19 10:25:55.365315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.436 #40 NEW cov: 11958 ft: 14229 corp: 18/741b lim: 50 exec/s: 40 rss: 70Mb L: 50/50 MS: 1 CrossOver- 00:07:33.436 [2024-04-19 10:25:55.404963] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.436 [2024-04-19 10:25:55.404989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.436 [2024-04-19 10:25:55.405042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.436 [2024-04-19 10:25:55.405057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.436 [2024-04-19 10:25:55.405115] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.436 [2024-04-19 10:25:55.405133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.436 [2024-04-19 10:25:55.405188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.436 [2024-04-19 10:25:55.405204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.436 #41 NEW cov: 11958 ft: 14248 corp: 19/784b lim: 50 exec/s: 41 rss: 70Mb L: 43/50 MS: 1 InsertRepeatedBytes- 00:07:33.436 [2024-04-19 10:25:55.445244] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.436 [2024-04-19 10:25:55.445271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.436 [2024-04-19 10:25:55.445327] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.436 [2024-04-19 10:25:55.445344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.436 [2024-04-19 10:25:55.445397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.436 [2024-04-19 10:25:55.445413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.436 [2024-04-19 10:25:55.445467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.436 [2024-04-19 10:25:55.445481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.436 [2024-04-19 10:25:55.445537] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.436 [2024-04-19 10:25:55.445551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.436 #42 NEW cov: 11958 ft: 14262 corp: 20/834b lim: 50 exec/s: 42 rss: 70Mb L: 50/50 MS: 1 ShuffleBytes- 00:07:33.436 [2024-04-19 10:25:55.485337] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.437 [2024-04-19 10:25:55.485362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.437 [2024-04-19 10:25:55.485421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.437 [2024-04-19 10:25:55.485437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.437 [2024-04-19 10:25:55.485494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.437 [2024-04-19 10:25:55.485509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.437 [2024-04-19 10:25:55.485564] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.437 [2024-04-19 10:25:55.485579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.437 [2024-04-19 10:25:55.485634] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.437 [2024-04-19 10:25:55.485649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.437 #43 NEW cov: 11958 ft: 14271 corp: 21/884b lim: 50 exec/s: 43 rss: 70Mb L: 50/50 MS: 1 ChangeBinInt- 00:07:33.437 [2024-04-19 10:25:55.525317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.437 [2024-04-19 10:25:55.525343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.437 [2024-04-19 10:25:55.525411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.437 [2024-04-19 
10:25:55.525426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.437 [2024-04-19 10:25:55.525484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.437 [2024-04-19 10:25:55.525500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.437 [2024-04-19 10:25:55.525557] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.437 [2024-04-19 10:25:55.525573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.695 #44 NEW cov: 11958 ft: 14286 corp: 22/930b lim: 50 exec/s: 44 rss: 70Mb L: 46/50 MS: 1 InsertRepeatedBytes- 00:07:33.695 [2024-04-19 10:25:55.565251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.695 [2024-04-19 10:25:55.565278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.695 [2024-04-19 10:25:55.565329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.695 [2024-04-19 10:25:55.565345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.696 [2024-04-19 10:25:55.565402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.696 [2024-04-19 10:25:55.565418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.696 #45 NEW cov: 11958 ft: 14326 corp: 23/968b lim: 50 exec/s: 45 rss: 70Mb L: 38/50 MS: 1 ShuffleBytes- 00:07:33.696 [2024-04-19 10:25:55.605542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.696 [2024-04-19 10:25:55.605568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.696 [2024-04-19 10:25:55.605636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.696 [2024-04-19 10:25:55.605653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.696 [2024-04-19 10:25:55.605706] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.696 [2024-04-19 10:25:55.605721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.696 [2024-04-19 10:25:55.605779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.696 [2024-04-19 10:25:55.605794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.696 #46 NEW cov: 11958 ft: 14330 corp: 24/1015b lim: 50 exec/s: 46 rss: 70Mb L: 47/50 MS: 1 ChangeBit- 00:07:33.696 [2024-04-19 10:25:55.645647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.696 
[2024-04-19 10:25:55.645674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.696 [2024-04-19 10:25:55.645718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.696 [2024-04-19 10:25:55.645734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.696 [2024-04-19 10:25:55.645788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.696 [2024-04-19 10:25:55.645803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.696 [2024-04-19 10:25:55.645867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.696 [2024-04-19 10:25:55.645883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.696 #47 NEW cov: 11958 ft: 14338 corp: 25/1062b lim: 50 exec/s: 47 rss: 70Mb L: 47/50 MS: 1 CMP- DE: "\000\031\371!n\271\014\""- 00:07:33.696 [2024-04-19 10:25:55.685600] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.696 [2024-04-19 10:25:55.685625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.696 [2024-04-19 10:25:55.685674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.696 [2024-04-19 10:25:55.685690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.696 [2024-04-19 10:25:55.685747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.696 [2024-04-19 10:25:55.685762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.696 #48 NEW cov: 11958 ft: 14342 corp: 26/1100b lim: 50 exec/s: 48 rss: 70Mb L: 38/50 MS: 1 CopyPart- 00:07:33.696 [2024-04-19 10:25:55.725673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.696 [2024-04-19 10:25:55.725699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.696 [2024-04-19 10:25:55.725754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.696 [2024-04-19 10:25:55.725769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.696 [2024-04-19 10:25:55.725831] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.696 [2024-04-19 10:25:55.725847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.696 #49 NEW cov: 11958 ft: 14402 corp: 27/1131b lim: 50 exec/s: 49 rss: 70Mb L: 31/50 MS: 1 EraseBytes- 00:07:33.696 [2024-04-19 10:25:55.766094] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 
nsid:0 00:07:33.696 [2024-04-19 10:25:55.766119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.696 [2024-04-19 10:25:55.766177] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.696 [2024-04-19 10:25:55.766192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.696 [2024-04-19 10:25:55.766246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.696 [2024-04-19 10:25:55.766262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.696 [2024-04-19 10:25:55.766315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.696 [2024-04-19 10:25:55.766330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.696 [2024-04-19 10:25:55.766389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.696 [2024-04-19 10:25:55.766404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.696 #50 NEW cov: 11958 ft: 14409 corp: 28/1181b lim: 50 exec/s: 50 rss: 70Mb L: 50/50 MS: 1 ChangeBinInt- 00:07:33.696 [2024-04-19 10:25:55.806106] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.696 [2024-04-19 10:25:55.806134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.696 [2024-04-19 10:25:55.806176] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.696 [2024-04-19 10:25:55.806191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.696 [2024-04-19 10:25:55.806249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.696 [2024-04-19 10:25:55.806265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.696 [2024-04-19 10:25:55.806322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.696 [2024-04-19 10:25:55.806338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.955 #51 NEW cov: 11958 ft: 14431 corp: 29/1229b lim: 50 exec/s: 51 rss: 71Mb L: 48/50 MS: 1 InsertByte- 00:07:33.955 [2024-04-19 10:25:55.845740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.955 [2024-04-19 10:25:55.845767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.955 #53 NEW cov: 11958 ft: 15209 corp: 30/1245b lim: 50 exec/s: 53 rss: 71Mb L: 16/50 MS: 2 InsertByte-CrossOver- 00:07:33.955 [2024-04-19 10:25:55.886442] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 
cid:0 nsid:0 00:07:33.955 [2024-04-19 10:25:55.886468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.955 [2024-04-19 10:25:55.886520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.955 [2024-04-19 10:25:55.886536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.955 [2024-04-19 10:25:55.886593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.955 [2024-04-19 10:25:55.886607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.955 [2024-04-19 10:25:55.886664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.955 [2024-04-19 10:25:55.886680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.955 [2024-04-19 10:25:55.886736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.955 [2024-04-19 10:25:55.886751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.955 #54 NEW cov: 11958 ft: 15273 corp: 31/1295b lim: 50 exec/s: 54 rss: 71Mb L: 50/50 MS: 1 CrossOver- 00:07:33.955 [2024-04-19 10:25:55.926579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.955 [2024-04-19 10:25:55.926605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.955 [2024-04-19 10:25:55.926679] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.955 [2024-04-19 10:25:55.926696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.955 [2024-04-19 10:25:55.926754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.955 [2024-04-19 10:25:55.926768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.955 [2024-04-19 10:25:55.926831] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.955 [2024-04-19 10:25:55.926847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.955 [2024-04-19 10:25:55.926903] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.955 [2024-04-19 10:25:55.926920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.955 #55 NEW cov: 11958 ft: 15280 corp: 32/1345b lim: 50 exec/s: 55 rss: 71Mb L: 50/50 MS: 1 ChangeBinInt- 00:07:33.955 [2024-04-19 10:25:55.966563] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.955 [2024-04-19 10:25:55.966588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.955 [2024-04-19 10:25:55.966640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.955 [2024-04-19 10:25:55.966656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.955 [2024-04-19 10:25:55.966709] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.955 [2024-04-19 10:25:55.966726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.955 [2024-04-19 10:25:55.966782] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.955 [2024-04-19 10:25:55.966798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.955 #56 NEW cov: 11958 ft: 15298 corp: 33/1388b lim: 50 exec/s: 56 rss: 71Mb L: 43/50 MS: 1 ChangeBit- 00:07:33.955 [2024-04-19 10:25:56.006493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.955 [2024-04-19 10:25:56.006518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.955 [2024-04-19 10:25:56.006564] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.955 [2024-04-19 10:25:56.006579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.955 [2024-04-19 10:25:56.006637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.955 [2024-04-19 10:25:56.006653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.955 #57 NEW cov: 11958 ft: 15305 corp: 34/1419b lim: 50 exec/s: 57 rss: 71Mb L: 31/50 MS: 1 ChangeByte- 00:07:33.955 [2024-04-19 10:25:56.046468] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.955 [2024-04-19 10:25:56.046493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.955 [2024-04-19 10:25:56.046532] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.955 [2024-04-19 10:25:56.046548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.214 #58 NEW cov: 11958 ft: 15313 corp: 35/1447b lim: 50 exec/s: 58 rss: 71Mb L: 28/50 MS: 1 EraseBytes- 00:07:34.214 [2024-04-19 10:25:56.087027] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:34.214 [2024-04-19 10:25:56.087054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.214 [2024-04-19 10:25:56.087110] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:34.214 [2024-04-19 10:25:56.087126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.214 [2024-04-19 10:25:56.087199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:34.214 [2024-04-19 10:25:56.087215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:34.214 [2024-04-19 10:25:56.087273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:34.214 [2024-04-19 10:25:56.087288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:34.214 [2024-04-19 10:25:56.087346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:34.214 [2024-04-19 10:25:56.087361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:34.214 #59 NEW cov: 11958 ft: 15314 corp: 36/1497b lim: 50 exec/s: 59 rss: 71Mb L: 50/50 MS: 1 ChangeBinInt- 00:07:34.214 [2024-04-19 10:25:56.137190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:34.214 [2024-04-19 10:25:56.137216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.214 [2024-04-19 10:25:56.137271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:34.214 [2024-04-19 10:25:56.137287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.214 [2024-04-19 10:25:56.137343] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:34.214 [2024-04-19 10:25:56.137359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:34.214 [2024-04-19 10:25:56.137414] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:34.214 [2024-04-19 10:25:56.137430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:34.214 [2024-04-19 10:25:56.137486] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:34.214 [2024-04-19 10:25:56.137501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:34.214 #60 NEW cov: 11958 ft: 15335 corp: 37/1547b lim: 50 exec/s: 30 rss: 72Mb L: 50/50 MS: 1 PersAutoDict- DE: "\000\031\371!n\271\014\""- 00:07:34.214 #60 DONE cov: 11958 ft: 15335 corp: 37/1547b lim: 50 exec/s: 30 rss: 72Mb 00:07:34.214 ###### Recommended dictionary. ###### 00:07:34.214 "\001\000\002\000" # Uses: 0 00:07:34.214 "\000\031\371!n\271\014\"" # Uses: 1 00:07:34.214 ###### End of recommended dictionary. 
###### 00:07:34.214 Done 60 runs in 2 second(s) 00:07:34.214 10:25:56 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_21.conf /var/tmp/suppress_nvmf_fuzz 00:07:34.214 10:25:56 -- ../common.sh@72 -- # (( i++ )) 00:07:34.214 10:25:56 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:34.214 10:25:56 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:07:34.214 10:25:56 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:07:34.215 10:25:56 -- nvmf/run.sh@24 -- # local timen=1 00:07:34.215 10:25:56 -- nvmf/run.sh@25 -- # local core=0x1 00:07:34.215 10:25:56 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:07:34.215 10:25:56 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:07:34.215 10:25:56 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:34.215 10:25:56 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:34.215 10:25:56 -- nvmf/run.sh@34 -- # printf %02d 22 00:07:34.215 10:25:56 -- nvmf/run.sh@34 -- # port=4422 00:07:34.215 10:25:56 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:07:34.215 10:25:56 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:07:34.215 10:25:56 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:34.215 10:25:56 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:34.215 10:25:56 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:34.215 10:25:56 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 00:07:34.215 [2024-04-19 10:25:56.318182] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:07:34.215 [2024-04-19 10:25:56.318261] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid207880 ] 00:07:34.473 EAL: No free 2048 kB hugepages reported on node 1 00:07:34.473 [2024-04-19 10:25:56.500785] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.473 [2024-04-19 10:25:56.568740] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.732 [2024-04-19 10:25:56.628148] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:34.732 [2024-04-19 10:25:56.644283] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:07:34.732 INFO: Running with entropic power schedule (0xFF, 100). 
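Condensed, the nvmf/run.sh trace above amounts to the following per-fuzzer launch sequence. This is a sketch reconstructed only from the traced commands; SPDK_DIR stands in for the workspace checkout, and the output redirections on the sed and printf steps are inferred (the trace itself does not show them):

  # Sketch of one short-fuzz pass as traced above (N=22 for this run).
  N=22
  port="44$(printf %02d "$N")"    # printf %02d 22 -> port 4422
  conf="/tmp/fuzz_json_${N}.conf"
  corpus="$SPDK_DIR/../corpus/llvm_nvmf_${N}"
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

  mkdir -p "$corpus"
  # Point the JSON config template at this pass's port (destination file inferred).
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$conf"
  # Suppress the two known-benign leaks before running under LeakSanitizer.
  printf 'leak:%s\n' spdk_nvmf_qpair_disconnect nvmf_ctrlr_create \
      > /var/tmp/suppress_nvmf_fuzz
  LSAN_OPTIONS="report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0" \
  "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
      -m 0x1 -s 512 -P "$SPDK_DIR/../output/llvm/" \
      -F "$trid" -c "$conf" -t 1 -D "$corpus" -Z "$N"

In the libFuzzer status lines that follow, cov and ft are the coverage and feature counters, corp gives the corpus size in units/bytes, L is the new input's size against the largest accepted so far, and MS names the mutation sequence that produced it; the "#60 DONE" line above was run 21's final tally, and "#64 DONE" below is run 22's.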
00:07:34.732 INFO: Seed: 3589985285 00:07:34.732 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:07:34.732 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:07:34.732 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:07:34.732 INFO: A corpus is not provided, starting from an empty corpus 00:07:34.732 #2 INITED exec/s: 0 rss: 63Mb 00:07:34.732 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:34.732 This may also happen if the target rejected all inputs we tried so far 00:07:34.732 [2024-04-19 10:25:56.689034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:34.732 [2024-04-19 10:25:56.689067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.991 NEW_FUNC[1/672]: 0x4a9490 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:07:34.991 NEW_FUNC[2/672]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:34.991 #17 NEW cov: 11740 ft: 11741 corp: 2/20b lim: 85 exec/s: 0 rss: 69Mb L: 19/19 MS: 5 ShuffleBytes-InsertByte-InsertByte-ChangeByte-InsertRepeatedBytes- 00:07:34.991 [2024-04-19 10:25:57.030148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:34.991 [2024-04-19 10:25:57.030193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.991 [2024-04-19 10:25:57.030231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:34.991 [2024-04-19 10:25:57.030249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.991 [2024-04-19 10:25:57.030279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:34.991 [2024-04-19 10:25:57.030296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:34.991 [2024-04-19 10:25:57.030324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:34.991 [2024-04-19 10:25:57.030340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:34.991 [2024-04-19 10:25:57.030373] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:07:34.991 [2024-04-19 10:25:57.030390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:34.991 #22 NEW cov: 11870 ft: 13110 corp: 3/105b lim: 85 exec/s: 0 rss: 69Mb L: 85/85 MS: 5 ChangeBit-CrossOver-ChangeByte-ChangeByte-InsertRepeatedBytes- 00:07:34.991 [2024-04-19 10:25:57.090057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:34.991 [2024-04-19 10:25:57.090090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:07:34.991 [2024-04-19 10:25:57.090124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:34.991 [2024-04-19 10:25:57.090142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.991 [2024-04-19 10:25:57.090173] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:34.991 [2024-04-19 10:25:57.090190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:34.991 [2024-04-19 10:25:57.090219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:34.991 [2024-04-19 10:25:57.090236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:35.249 #23 NEW cov: 11876 ft: 13408 corp: 4/179b lim: 85 exec/s: 0 rss: 69Mb L: 74/85 MS: 1 InsertRepeatedBytes- 00:07:35.249 [2024-04-19 10:25:57.140020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.249 [2024-04-19 10:25:57.140051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.249 #28 NEW cov: 11961 ft: 13645 corp: 5/200b lim: 85 exec/s: 0 rss: 69Mb L: 21/85 MS: 5 CopyPart-ShuffleBytes-CopyPart-InsertByte-InsertRepeatedBytes- 00:07:35.249 [2024-04-19 10:25:57.190145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.249 [2024-04-19 10:25:57.190175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.249 #29 NEW cov: 11961 ft: 13804 corp: 6/222b lim: 85 exec/s: 0 rss: 69Mb L: 22/85 MS: 1 InsertByte- 00:07:35.249 [2024-04-19 10:25:57.260550] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.249 [2024-04-19 10:25:57.260579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.249 [2024-04-19 10:25:57.260627] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:35.249 [2024-04-19 10:25:57.260645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.249 [2024-04-19 10:25:57.260676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:35.249 [2024-04-19 10:25:57.260692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.249 [2024-04-19 10:25:57.260721] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:35.249 [2024-04-19 10:25:57.260738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:35.249 [2024-04-19 10:25:57.260767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:07:35.249 [2024-04-19 10:25:57.260783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:35.249 #30 NEW cov: 11961 ft: 13865 corp: 7/307b lim: 85 exec/s: 0 rss: 70Mb L: 85/85 MS: 1 ChangeByte- 00:07:35.249 [2024-04-19 10:25:57.330512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.249 [2024-04-19 10:25:57.330543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.507 #31 NEW cov: 11961 ft: 13964 corp: 8/329b lim: 85 exec/s: 0 rss: 70Mb L: 22/85 MS: 1 CMP- DE: "\377\377"- 00:07:35.507 [2024-04-19 10:25:57.400668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.507 [2024-04-19 10:25:57.400697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.507 #32 NEW cov: 11961 ft: 13991 corp: 9/350b lim: 85 exec/s: 0 rss: 70Mb L: 21/85 MS: 1 ChangeBit- 00:07:35.507 [2024-04-19 10:25:57.450757] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.507 [2024-04-19 10:25:57.450786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.507 #33 NEW cov: 11961 ft: 14021 corp: 10/371b lim: 85 exec/s: 0 rss: 70Mb L: 21/85 MS: 1 ChangeBinInt- 00:07:35.507 [2024-04-19 10:25:57.500899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.507 [2024-04-19 10:25:57.500928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.507 #34 NEW cov: 11961 ft: 14071 corp: 11/388b lim: 85 exec/s: 0 rss: 70Mb L: 17/85 MS: 1 EraseBytes- 00:07:35.507 [2024-04-19 10:25:57.561125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.507 [2024-04-19 10:25:57.561156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.508 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:35.508 #35 NEW cov: 11984 ft: 14124 corp: 12/415b lim: 85 exec/s: 0 rss: 70Mb L: 27/85 MS: 1 CopyPart- 00:07:35.508 [2024-04-19 10:25:57.611449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.508 [2024-04-19 10:25:57.611478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.508 [2024-04-19 10:25:57.611526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:35.508 [2024-04-19 10:25:57.611544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.508 [2024-04-19 10:25:57.611575] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:35.508 [2024-04-19 10:25:57.611591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.508 [2024-04-19 10:25:57.611620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER 
(0d) sqid:1 cid:3 nsid:0 00:07:35.508 [2024-04-19 10:25:57.611637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:35.508 [2024-04-19 10:25:57.611666] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:07:35.508 [2024-04-19 10:25:57.611683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:35.767 #36 NEW cov: 11984 ft: 14146 corp: 13/500b lim: 85 exec/s: 0 rss: 70Mb L: 85/85 MS: 1 CopyPart- 00:07:35.767 [2024-04-19 10:25:57.681430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.767 [2024-04-19 10:25:57.681460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.767 #37 NEW cov: 11984 ft: 14229 corp: 14/522b lim: 85 exec/s: 37 rss: 70Mb L: 22/85 MS: 1 InsertByte- 00:07:35.767 [2024-04-19 10:25:57.731779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.767 [2024-04-19 10:25:57.731817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.767 [2024-04-19 10:25:57.731852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:35.767 [2024-04-19 10:25:57.731869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.767 [2024-04-19 10:25:57.731900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:35.767 [2024-04-19 10:25:57.731916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.767 [2024-04-19 10:25:57.731946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:35.767 [2024-04-19 10:25:57.731962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:35.767 [2024-04-19 10:25:57.731991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:07:35.767 [2024-04-19 10:25:57.732007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:35.767 #38 NEW cov: 11984 ft: 14239 corp: 15/607b lim: 85 exec/s: 38 rss: 70Mb L: 85/85 MS: 1 CopyPart- 00:07:35.767 [2024-04-19 10:25:57.781654] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.767 [2024-04-19 10:25:57.781684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.767 #39 NEW cov: 11984 ft: 14306 corp: 16/629b lim: 85 exec/s: 39 rss: 70Mb L: 22/85 MS: 1 InsertByte- 00:07:35.767 [2024-04-19 10:25:57.831783] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.767 [2024-04-19 10:25:57.831819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:07:36.026 #40 NEW cov: 11984 ft: 14318 corp: 17/656b lim: 85 exec/s: 40 rss: 70Mb L: 27/85 MS: 1 ChangeByte- 00:07:36.026 [2024-04-19 10:25:57.901960] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.026 [2024-04-19 10:25:57.901989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.026 #41 NEW cov: 11984 ft: 14345 corp: 18/685b lim: 85 exec/s: 41 rss: 70Mb L: 29/85 MS: 1 PersAutoDict- DE: "\377\377"- 00:07:36.026 [2024-04-19 10:25:57.962221] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.026 [2024-04-19 10:25:57.962249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.026 [2024-04-19 10:25:57.962297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.026 [2024-04-19 10:25:57.962315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.026 [2024-04-19 10:25:57.962347] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.026 [2024-04-19 10:25:57.962363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.026 #42 NEW cov: 11984 ft: 14686 corp: 19/743b lim: 85 exec/s: 42 rss: 70Mb L: 58/85 MS: 1 EraseBytes- 00:07:36.026 [2024-04-19 10:25:58.012235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.026 [2024-04-19 10:25:58.012264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.026 #43 NEW cov: 11984 ft: 14726 corp: 20/770b lim: 85 exec/s: 43 rss: 70Mb L: 27/85 MS: 1 ShuffleBytes- 00:07:36.026 [2024-04-19 10:25:58.062393] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.026 [2024-04-19 10:25:58.062423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.026 #54 NEW cov: 11984 ft: 14728 corp: 21/791b lim: 85 exec/s: 54 rss: 70Mb L: 21/85 MS: 1 ChangeByte- 00:07:36.026 [2024-04-19 10:25:58.112489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.026 [2024-04-19 10:25:58.112520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.285 #55 NEW cov: 11984 ft: 14777 corp: 22/811b lim: 85 exec/s: 55 rss: 70Mb L: 20/85 MS: 1 InsertByte- 00:07:36.285 [2024-04-19 10:25:58.182820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.285 [2024-04-19 10:25:58.182850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.285 [2024-04-19 10:25:58.182883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.285 [2024-04-19 10:25:58.182901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.285 [2024-04-19 10:25:58.182947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.285 [2024-04-19 10:25:58.182964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.285 #56 NEW cov: 11984 ft: 14794 corp: 23/869b lim: 85 exec/s: 56 rss: 70Mb L: 58/85 MS: 1 CrossOver- 00:07:36.285 [2024-04-19 10:25:58.252850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.285 [2024-04-19 10:25:58.252878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.285 #57 NEW cov: 11984 ft: 14834 corp: 24/899b lim: 85 exec/s: 57 rss: 70Mb L: 30/85 MS: 1 InsertRepeatedBytes- 00:07:36.285 [2024-04-19 10:25:58.323326] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.285 [2024-04-19 10:25:58.323358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.285 [2024-04-19 10:25:58.323407] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.285 [2024-04-19 10:25:58.323425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.285 [2024-04-19 10:25:58.323457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.285 [2024-04-19 10:25:58.323474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.285 [2024-04-19 10:25:58.323503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.285 [2024-04-19 10:25:58.323520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.285 [2024-04-19 10:25:58.323550] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:07:36.285 [2024-04-19 10:25:58.323566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:36.285 #58 NEW cov: 11984 ft: 14866 corp: 25/984b lim: 85 exec/s: 58 rss: 70Mb L: 85/85 MS: 1 ChangeBinInt- 00:07:36.285 [2024-04-19 10:25:58.373395] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.285 [2024-04-19 10:25:58.373428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.285 [2024-04-19 10:25:58.373477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.285 [2024-04-19 10:25:58.373494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.285 [2024-04-19 10:25:58.373525] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.285 [2024-04-19 10:25:58.373541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.285 [2024-04-19 10:25:58.373570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.285 [2024-04-19 10:25:58.373586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.285 [2024-04-19 10:25:58.373615] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:07:36.285 [2024-04-19 10:25:58.373631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:36.544 #59 NEW cov: 11984 ft: 14899 corp: 26/1069b lim: 85 exec/s: 59 rss: 70Mb L: 85/85 MS: 1 CopyPart- 00:07:36.544 [2024-04-19 10:25:58.443448] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.544 [2024-04-19 10:25:58.443477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.544 [2024-04-19 10:25:58.443526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.544 [2024-04-19 10:25:58.443544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.544 [2024-04-19 10:25:58.443575] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.544 [2024-04-19 10:25:58.443591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.544 #60 NEW cov: 11984 ft: 14916 corp: 27/1135b lim: 85 exec/s: 60 rss: 70Mb L: 66/85 MS: 1 EraseBytes- 00:07:36.544 [2024-04-19 10:25:58.493633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.544 [2024-04-19 10:25:58.493661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.544 [2024-04-19 10:25:58.493709] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.545 [2024-04-19 10:25:58.493727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.545 [2024-04-19 10:25:58.493758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.545 [2024-04-19 10:25:58.493774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.545 [2024-04-19 10:25:58.493804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.545 [2024-04-19 10:25:58.493826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.545 #61 NEW cov: 11984 ft: 14953 corp: 28/1209b lim: 85 exec/s: 61 rss: 71Mb L: 74/85 MS: 1 ShuffleBytes- 00:07:36.545 [2024-04-19 10:25:58.563879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.545 [2024-04-19 10:25:58.563908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.545 [2024-04-19 10:25:58.563955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.545 [2024-04-19 10:25:58.563977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.545 [2024-04-19 10:25:58.564008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.545 [2024-04-19 10:25:58.564025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.545 [2024-04-19 10:25:58.564053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.545 [2024-04-19 10:25:58.564070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.545 [2024-04-19 10:25:58.564099] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:07:36.545 [2024-04-19 10:25:58.564115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:36.545 #62 NEW cov: 11984 ft: 14964 corp: 29/1294b lim: 85 exec/s: 62 rss: 71Mb L: 85/85 MS: 1 ChangeByte- 00:07:36.545 [2024-04-19 10:25:58.633852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.545 [2024-04-19 10:25:58.633881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.805 #63 NEW cov: 11984 ft: 14988 corp: 30/1314b lim: 85 exec/s: 63 rss: 71Mb L: 20/85 MS: 1 InsertByte- 00:07:36.805 [2024-04-19 10:25:58.683976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.805 [2024-04-19 10:25:58.684005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.805 #64 pulse cov: 11984 ft: 14998 corp: 30/1314b lim: 85 exec/s: 32 rss: 71Mb 00:07:36.805 #64 NEW cov: 11984 ft: 14998 corp: 31/1333b lim: 85 exec/s: 32 rss: 71Mb L: 19/85 MS: 1 EraseBytes- 00:07:36.805 #64 DONE cov: 11984 ft: 14998 corp: 31/1333b lim: 85 exec/s: 32 rss: 71Mb 00:07:36.805 ###### Recommended dictionary. ###### 00:07:36.805 "\377\377" # Uses: 2 00:07:36.805 ###### End of recommended dictionary. 
###### 00:07:36.805 Done 64 runs in 2 second(s) 00:07:36.805 10:25:58 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_22.conf /var/tmp/suppress_nvmf_fuzz 00:07:36.805 10:25:58 -- ../common.sh@72 -- # (( i++ )) 00:07:36.805 10:25:58 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:36.805 10:25:58 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:07:36.805 10:25:58 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:07:36.805 10:25:58 -- nvmf/run.sh@24 -- # local timen=1 00:07:36.805 10:25:58 -- nvmf/run.sh@25 -- # local core=0x1 00:07:36.805 10:25:58 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:07:36.805 10:25:58 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:07:36.805 10:25:58 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:36.805 10:25:58 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:36.805 10:25:58 -- nvmf/run.sh@34 -- # printf %02d 23 00:07:36.805 10:25:58 -- nvmf/run.sh@34 -- # port=4423 00:07:36.805 10:25:58 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:07:36.805 10:25:58 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:07:36.805 10:25:58 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:36.805 10:25:58 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:36.805 10:25:58 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:36.805 10:25:58 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 00:07:36.805 [2024-04-19 10:25:58.886395] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:07:36.805 [2024-04-19 10:25:58.886464] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid208179 ] 00:07:37.064 EAL: No free 2048 kB hugepages reported on node 1 00:07:37.064 [2024-04-19 10:25:59.070945] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.064 [2024-04-19 10:25:59.138755] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.323 [2024-04-19 10:25:59.197930] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:37.323 [2024-04-19 10:25:59.214060] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:07:37.323 INFO: Running with entropic power schedule (0xFF, 100). 
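The same sequence then repeats for fuzzer 23 with only the id-derived values changing (config /tmp/fuzz_json_23.conf, corpus llvm_nvmf_23, listener port 4423). Judging from the printf %02d steps in both traces, the per-pass TCP port appears to be simply 44 followed by the zero-padded fuzzer id; a one-liner to check that inference (an assumption read off the traces, not confirmed against run.sh itself):

  # Inferred port scheme: trsvcid = "44" + zero-padded fuzzer id.
  for N in 22 23; do printf 'fuzzer %d -> trsvcid 44%02d\n' "$N" "$N"; done
  # prints 4422 and 4423, matching the ports in the two traces above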
00:07:37.323 INFO: Seed: 1864018496 00:07:37.323 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:07:37.323 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:07:37.323 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:07:37.323 INFO: A corpus is not provided, starting from an empty corpus 00:07:37.323 #2 INITED exec/s: 0 rss: 63Mb 00:07:37.323 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:37.323 This may also happen if the target rejected all inputs we tried so far 00:07:37.323 [2024-04-19 10:25:59.258912] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:37.323 [2024-04-19 10:25:59.258945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.323 [2024-04-19 10:25:59.258995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:37.323 [2024-04-19 10:25:59.259014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.323 [2024-04-19 10:25:59.259046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:37.323 [2024-04-19 10:25:59.259062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:37.582 NEW_FUNC[1/671]: 0x4ac6c0 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:07:37.582 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:37.582 #5 NEW cov: 11673 ft: 11674 corp: 2/16b lim: 25 exec/s: 0 rss: 69Mb L: 15/15 MS: 3 CopyPart-CrossOver-InsertRepeatedBytes- 00:07:37.582 [2024-04-19 10:25:59.599719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:37.582 [2024-04-19 10:25:59.599759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.582 [2024-04-19 10:25:59.599822] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:37.582 [2024-04-19 10:25:59.599840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.582 [2024-04-19 10:25:59.599872] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:37.582 [2024-04-19 10:25:59.599888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:37.582 #6 NEW cov: 11803 ft: 12163 corp: 3/31b lim: 25 exec/s: 0 rss: 69Mb L: 15/15 MS: 1 CopyPart- 00:07:37.582 [2024-04-19 10:25:59.669731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:37.582 [2024-04-19 10:25:59.669763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.582 [2024-04-19 10:25:59.669822] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:37.582 [2024-04-19 10:25:59.669844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.841 #7 NEW cov: 11809 ft: 12613 corp: 4/41b lim: 25 exec/s: 0 rss: 69Mb L: 10/15 MS: 1 EraseBytes- 00:07:37.841 [2024-04-19 10:25:59.739884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:37.841 [2024-04-19 10:25:59.739913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.841 [2024-04-19 10:25:59.739963] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:37.841 [2024-04-19 10:25:59.739981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.841 #8 NEW cov: 11894 ft: 12830 corp: 5/54b lim: 25 exec/s: 0 rss: 69Mb L: 13/15 MS: 1 EraseBytes- 00:07:37.841 [2024-04-19 10:25:59.790038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:37.841 [2024-04-19 10:25:59.790067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.841 [2024-04-19 10:25:59.790116] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:37.841 [2024-04-19 10:25:59.790135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.841 #9 NEW cov: 11894 ft: 13054 corp: 6/67b lim: 25 exec/s: 0 rss: 70Mb L: 13/15 MS: 1 CrossOver- 00:07:37.841 [2024-04-19 10:25:59.860324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:37.841 [2024-04-19 10:25:59.860353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.841 [2024-04-19 10:25:59.860400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:37.841 [2024-04-19 10:25:59.860418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.841 [2024-04-19 10:25:59.860450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:37.841 [2024-04-19 10:25:59.860466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:37.841 [2024-04-19 10:25:59.860495] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:37.841 [2024-04-19 10:25:59.860511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:37.841 [2024-04-19 10:25:59.860541] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:07:37.841 [2024-04-19 10:25:59.860557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:37.841 #10 NEW cov: 11894 ft: 13594 corp: 7/92b lim: 25 exec/s: 0 rss: 70Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:07:37.841 
[2024-04-19 10:25:59.920421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:37.841 [2024-04-19 10:25:59.920449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.841 [2024-04-19 10:25:59.920498] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:37.841 [2024-04-19 10:25:59.920516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.841 [2024-04-19 10:25:59.920547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:37.841 [2024-04-19 10:25:59.920563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.100 #11 NEW cov: 11894 ft: 13638 corp: 8/107b lim: 25 exec/s: 0 rss: 70Mb L: 15/25 MS: 1 ShuffleBytes- 00:07:38.100 [2024-04-19 10:25:59.970562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.100 [2024-04-19 10:25:59.970591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.100 [2024-04-19 10:25:59.970639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:38.100 [2024-04-19 10:25:59.970657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.100 [2024-04-19 10:25:59.970688] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:38.100 [2024-04-19 10:25:59.970704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.100 [2024-04-19 10:25:59.970734] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:38.100 [2024-04-19 10:25:59.970750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.100 #12 NEW cov: 11894 ft: 13709 corp: 9/130b lim: 25 exec/s: 0 rss: 70Mb L: 23/25 MS: 1 CMP- DE: "\357\303(\002\000\000\000\000"- 00:07:38.100 [2024-04-19 10:26:00.040792] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.100 [2024-04-19 10:26:00.040838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.100 [2024-04-19 10:26:00.040874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:38.100 [2024-04-19 10:26:00.040892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.100 [2024-04-19 10:26:00.040924] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:38.100 [2024-04-19 10:26:00.040940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.100 #13 NEW cov: 11894 ft: 13725 corp: 10/145b lim: 25 exec/s: 0 rss: 70Mb L: 15/25 MS: 1 ChangeBit- 
00:07:38.100 [2024-04-19 10:26:00.101037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.100 [2024-04-19 10:26:00.101075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.100 [2024-04-19 10:26:00.101112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:38.100 [2024-04-19 10:26:00.101130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.100 [2024-04-19 10:26:00.101162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:38.100 [2024-04-19 10:26:00.101179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.100 [2024-04-19 10:26:00.101209] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:38.100 [2024-04-19 10:26:00.101226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.100 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:38.100 #14 NEW cov: 11917 ft: 13828 corp: 11/168b lim: 25 exec/s: 0 rss: 70Mb L: 23/25 MS: 1 ChangeByte- 00:07:38.100 [2024-04-19 10:26:00.171175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.100 [2024-04-19 10:26:00.171207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.100 [2024-04-19 10:26:00.171246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:38.100 [2024-04-19 10:26:00.171265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.100 [2024-04-19 10:26:00.171296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:38.100 [2024-04-19 10:26:00.171313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.100 [2024-04-19 10:26:00.171343] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:38.100 [2024-04-19 10:26:00.171360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.359 #15 NEW cov: 11917 ft: 13867 corp: 12/192b lim: 25 exec/s: 0 rss: 70Mb L: 24/25 MS: 1 InsertRepeatedBytes- 00:07:38.359 [2024-04-19 10:26:00.241359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.359 [2024-04-19 10:26:00.241391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.359 [2024-04-19 10:26:00.241426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:38.359 [2024-04-19 10:26:00.241444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:07:38.359 [2024-04-19 10:26:00.241476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:38.359 [2024-04-19 10:26:00.241492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.359 [2024-04-19 10:26:00.241523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:38.359 [2024-04-19 10:26:00.241539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.359 #16 NEW cov: 11917 ft: 13901 corp: 13/215b lim: 25 exec/s: 16 rss: 70Mb L: 23/25 MS: 1 ChangeBit- 00:07:38.359 [2024-04-19 10:26:00.311527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.359 [2024-04-19 10:26:00.311558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.359 [2024-04-19 10:26:00.311591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:38.359 [2024-04-19 10:26:00.311608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.359 [2024-04-19 10:26:00.311640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:38.359 [2024-04-19 10:26:00.311656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.359 [2024-04-19 10:26:00.311686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:38.359 [2024-04-19 10:26:00.311702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.359 #17 NEW cov: 11917 ft: 13933 corp: 14/239b lim: 25 exec/s: 17 rss: 70Mb L: 24/25 MS: 1 ChangeBinInt- 00:07:38.359 [2024-04-19 10:26:00.381573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.359 [2024-04-19 10:26:00.381604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.359 [2024-04-19 10:26:00.381638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:38.359 [2024-04-19 10:26:00.381656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.359 #18 NEW cov: 11917 ft: 13949 corp: 15/252b lim: 25 exec/s: 18 rss: 70Mb L: 13/25 MS: 1 PersAutoDict- DE: "\357\303(\002\000\000\000\000"- 00:07:38.359 [2024-04-19 10:26:00.431758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.359 [2024-04-19 10:26:00.431790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.359 [2024-04-19 10:26:00.431833] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:38.359 [2024-04-19 10:26:00.431852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.618 #19 NEW cov: 11917 ft: 13968 corp: 16/265b lim: 25 exec/s: 19 rss: 71Mb L: 13/25 MS: 1 ChangeBinInt- 00:07:38.618 [2024-04-19 10:26:00.502027] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.618 [2024-04-19 10:26:00.502057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.618 [2024-04-19 10:26:00.502107] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:38.618 [2024-04-19 10:26:00.502125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.618 [2024-04-19 10:26:00.502157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:38.618 [2024-04-19 10:26:00.502174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.618 [2024-04-19 10:26:00.502204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:38.618 [2024-04-19 10:26:00.502221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.618 #20 NEW cov: 11917 ft: 14009 corp: 17/288b lim: 25 exec/s: 20 rss: 71Mb L: 23/25 MS: 1 InsertRepeatedBytes- 00:07:38.618 [2024-04-19 10:26:00.552202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.618 [2024-04-19 10:26:00.552233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.618 [2024-04-19 10:26:00.552267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:38.618 [2024-04-19 10:26:00.552285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.618 [2024-04-19 10:26:00.552317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:38.618 [2024-04-19 10:26:00.552333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.618 [2024-04-19 10:26:00.552364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:38.618 [2024-04-19 10:26:00.552381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.618 #21 NEW cov: 11917 ft: 14071 corp: 18/311b lim: 25 exec/s: 21 rss: 71Mb L: 23/25 MS: 1 PersAutoDict- DE: "\357\303(\002\000\000\000\000"- 00:07:38.618 [2024-04-19 10:26:00.602253] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.618 [2024-04-19 10:26:00.602285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.618 [2024-04-19 10:26:00.602319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:38.618 [2024-04-19 10:26:00.602336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.618 [2024-04-19 10:26:00.602376] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:38.618 [2024-04-19 10:26:00.602393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.618 [2024-04-19 10:26:00.602423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:38.618 [2024-04-19 10:26:00.602440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.618 #22 NEW cov: 11917 ft: 14097 corp: 19/334b lim: 25 exec/s: 22 rss: 71Mb L: 23/25 MS: 1 CrossOver- 00:07:38.618 [2024-04-19 10:26:00.672462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.618 [2024-04-19 10:26:00.672492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.618 [2024-04-19 10:26:00.672541] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:38.618 [2024-04-19 10:26:00.672559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.618 [2024-04-19 10:26:00.672591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:38.618 [2024-04-19 10:26:00.672608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.618 [2024-04-19 10:26:00.672638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:38.618 [2024-04-19 10:26:00.672655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.618 #23 NEW cov: 11917 ft: 14110 corp: 20/357b lim: 25 exec/s: 23 rss: 71Mb L: 23/25 MS: 1 ChangeByte- 00:07:38.618 [2024-04-19 10:26:00.722614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.618 [2024-04-19 10:26:00.722644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.618 [2024-04-19 10:26:00.722678] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:38.618 [2024-04-19 10:26:00.722696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.618 [2024-04-19 10:26:00.722728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:38.618 [2024-04-19 10:26:00.722745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.618 [2024-04-19 10:26:00.722774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:38.618 [2024-04-19 10:26:00.722791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.618 [2024-04-19 10:26:00.722831] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:07:38.618 [2024-04-19 10:26:00.722849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:38.876 #24 NEW cov: 11917 ft: 14123 corp: 21/382b lim: 25 exec/s: 24 rss: 71Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:07:38.876 [2024-04-19 10:26:00.782807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.876 [2024-04-19 10:26:00.782847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.876 [2024-04-19 10:26:00.782879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:38.876 [2024-04-19 10:26:00.782897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.876 [2024-04-19 10:26:00.782933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:38.876 [2024-04-19 10:26:00.782949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.877 [2024-04-19 10:26:00.782978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:38.877 [2024-04-19 10:26:00.782994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.877 [2024-04-19 10:26:00.783023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:07:38.877 [2024-04-19 10:26:00.783039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:38.877 #25 NEW cov: 11917 ft: 14150 corp: 22/407b lim: 25 exec/s: 25 rss: 71Mb L: 25/25 MS: 1 InsertByte- 00:07:38.877 [2024-04-19 10:26:00.842965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.877 [2024-04-19 10:26:00.842996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.877 [2024-04-19 10:26:00.843029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:38.877 [2024-04-19 10:26:00.843048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.877 [2024-04-19 10:26:00.843080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:38.877 [2024-04-19 10:26:00.843096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.877 [2024-04-19 10:26:00.843127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:38.877 [2024-04-19 10:26:00.843143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.877 [2024-04-19 10:26:00.843174] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:07:38.877 [2024-04-19 
10:26:00.843190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:38.877 #26 NEW cov: 11917 ft: 14183 corp: 23/432b lim: 25 exec/s: 26 rss: 71Mb L: 25/25 MS: 1 ChangeByte- 00:07:38.877 [2024-04-19 10:26:00.913175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.877 [2024-04-19 10:26:00.913207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.877 [2024-04-19 10:26:00.913240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:38.877 [2024-04-19 10:26:00.913258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.877 [2024-04-19 10:26:00.913291] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:38.877 [2024-04-19 10:26:00.913308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.877 [2024-04-19 10:26:00.913338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:38.877 [2024-04-19 10:26:00.913355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.877 [2024-04-19 10:26:00.913385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:07:38.877 [2024-04-19 10:26:00.913402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:38.877 #27 NEW cov: 11917 ft: 14206 corp: 24/457b lim: 25 exec/s: 27 rss: 71Mb L: 25/25 MS: 1 CrossOver- 00:07:38.877 [2024-04-19 10:26:00.983227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.877 [2024-04-19 10:26:00.983257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.877 [2024-04-19 10:26:00.983291] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:38.877 [2024-04-19 10:26:00.983309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.877 [2024-04-19 10:26:00.983341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:38.877 [2024-04-19 10:26:00.983358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:39.135 #28 NEW cov: 11917 ft: 14215 corp: 25/472b lim: 25 exec/s: 28 rss: 71Mb L: 15/25 MS: 1 ChangeBinInt- 00:07:39.135 [2024-04-19 10:26:01.033289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:39.135 [2024-04-19 10:26:01.033319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.135 [2024-04-19 10:26:01.033369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:39.135 [2024-04-19 10:26:01.033387] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:39.135 #29 NEW cov: 11917 ft: 14242 corp: 26/486b lim: 25 exec/s: 29 rss: 72Mb L: 14/25 MS: 1 InsertByte- 00:07:39.135 [2024-04-19 10:26:01.083468] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:39.135 [2024-04-19 10:26:01.083497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.135 [2024-04-19 10:26:01.083547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:39.135 [2024-04-19 10:26:01.083565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:39.135 [2024-04-19 10:26:01.083598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:39.135 [2024-04-19 10:26:01.083615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:39.135 #30 NEW cov: 11917 ft: 14264 corp: 27/501b lim: 25 exec/s: 30 rss: 72Mb L: 15/25 MS: 1 ChangeBit- 00:07:39.135 [2024-04-19 10:26:01.133575] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:39.135 [2024-04-19 10:26:01.133604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.135 [2024-04-19 10:26:01.133653] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:39.135 [2024-04-19 10:26:01.133671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:39.135 [2024-04-19 10:26:01.133702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:39.135 [2024-04-19 10:26:01.133718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:39.135 [2024-04-19 10:26:01.133748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:39.135 [2024-04-19 10:26:01.133764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:39.135 #31 NEW cov: 11917 ft: 14313 corp: 28/525b lim: 25 exec/s: 31 rss: 72Mb L: 24/25 MS: 1 CopyPart- 00:07:39.135 [2024-04-19 10:26:01.203787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:39.135 [2024-04-19 10:26:01.203825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.135 [2024-04-19 10:26:01.203874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:39.135 [2024-04-19 10:26:01.203892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:39.135 [2024-04-19 10:26:01.203923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:39.135 [2024-04-19 10:26:01.203940] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:39.135 [2024-04-19 10:26:01.203969] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:39.135 [2024-04-19 10:26:01.203985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:39.135 #32 NEW cov: 11917 ft: 14321 corp: 29/548b lim: 25 exec/s: 32 rss: 72Mb L: 23/25 MS: 1 CMP- DE: "\000\000\000\366"- 00:07:39.394 [2024-04-19 10:26:01.253976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:39.394 [2024-04-19 10:26:01.254004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.394 [2024-04-19 10:26:01.254053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:39.394 [2024-04-19 10:26:01.254071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:39.394 [2024-04-19 10:26:01.254102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:39.394 [2024-04-19 10:26:01.254118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:39.394 [2024-04-19 10:26:01.254147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:39.394 [2024-04-19 10:26:01.254163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:39.394 [2024-04-19 10:26:01.254193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:07:39.394 [2024-04-19 10:26:01.254209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:39.394 #33 NEW cov: 11917 ft: 14354 corp: 30/573b lim: 25 exec/s: 16 rss: 72Mb L: 25/25 MS: 1 CrossOver- 00:07:39.394 #33 DONE cov: 11917 ft: 14354 corp: 30/573b lim: 25 exec/s: 16 rss: 72Mb 00:07:39.394 ###### Recommended dictionary. ###### 00:07:39.394 "\357\303(\002\000\000\000\000" # Uses: 2 00:07:39.394 "\000\000\000\366" # Uses: 0 00:07:39.394 ###### End of recommended dictionary. 
######
00:07:39.394 Done 33 runs in 2 second(s)
00:07:39.394 10:26:01 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_23.conf /var/tmp/suppress_nvmf_fuzz
00:07:39.394 10:26:01 -- ../common.sh@72 -- # (( i++ ))
00:07:39.394 10:26:01 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:39.394 10:26:01 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1
00:07:39.394 10:26:01 -- nvmf/run.sh@23 -- # local fuzzer_type=24
00:07:39.394 10:26:01 -- nvmf/run.sh@24 -- # local timen=1
00:07:39.394 10:26:01 -- nvmf/run.sh@25 -- # local core=0x1
00:07:39.394 10:26:01 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:07:39.394 10:26:01 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf
00:07:39.394 10:26:01 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:07:39.394 10:26:01 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:07:39.394 10:26:01 -- nvmf/run.sh@34 -- # printf %02d 24
00:07:39.394 10:26:01 -- nvmf/run.sh@34 -- # port=4424
00:07:39.394 10:26:01 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:07:39.394 10:26:01 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424'
00:07:39.394 10:26:01 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:39.394 10:26:01 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:07:39.394 10:26:01 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:07:39.394 10:26:01 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24
[2024-04-19 10:26:01.449931] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization...
[2024-04-19 10:26:01.450001] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid208530 ]
00:07:39.654 EAL: No free 2048 kB hugepages reported on node 1
00:07:39.654 [2024-04-19 10:26:01.633499] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:39.654 [2024-04-19 10:26:01.702356] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:39.654 [2024-04-19 10:26:01.762027] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:39.913 [2024-04-19 10:26:01.778159] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 ***
00:07:39.913 INFO: Running with entropic power schedule (0xFF, 100).
00:07:39.913 INFO: Seed: 132076613 00:07:39.913 INFO: Loaded 1 modules (348156 inline 8-bit counters): 348156 [0x28ac78c, 0x2901788), 00:07:39.913 INFO: Loaded 1 PC tables (348156 PCs): 348156 [0x2901788,0x2e51748), 00:07:39.913 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:07:39.913 INFO: A corpus is not provided, starting from an empty corpus 00:07:39.913 #2 INITED exec/s: 0 rss: 63Mb 00:07:39.913 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:39.913 This may also happen if the target rejected all inputs we tried so far 00:07:39.913 [2024-04-19 10:26:01.849098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.913 [2024-04-19 10:26:01.849133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.913 [2024-04-19 10:26:01.849229] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.913 [2024-04-19 10:26:01.849251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:39.913 [2024-04-19 10:26:01.849343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.913 [2024-04-19 10:26:01.849361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.171 NEW_FUNC[1/672]: 0x4ad7a0 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:07:40.171 NEW_FUNC[2/672]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:40.171 #8 NEW cov: 11745 ft: 11746 corp: 2/69b lim: 100 exec/s: 0 rss: 69Mb L: 68/68 MS: 1 InsertRepeatedBytes- 00:07:40.171 [2024-04-19 10:26:02.180418] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.171 [2024-04-19 10:26:02.180461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.171 [2024-04-19 10:26:02.180552] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.171 [2024-04-19 10:26:02.180576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.171 [2024-04-19 10:26:02.180668] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.171 [2024-04-19 10:26:02.180686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.171 [2024-04-19 10:26:02.180772] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.171 [2024-04-19 10:26:02.180791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
00:07:40.171 #9 NEW cov: 11875 ft: 12662 corp: 3/162b lim: 100 exec/s: 0 rss: 69Mb L: 93/93 MS: 1 InsertRepeatedBytes- 00:07:40.171 [2024-04-19 10:26:02.230183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4412750542232763709 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.171 [2024-04-19 10:26:02.230212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.171 [2024-04-19 10:26:02.230269] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4412750543122677053 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.171 [2024-04-19 10:26:02.230285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.171 [2024-04-19 10:26:02.230351] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4412750543122677053 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.171 [2024-04-19 10:26:02.230371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.171 #12 NEW cov: 11881 ft: 12962 corp: 4/236b lim: 100 exec/s: 0 rss: 69Mb L: 74/93 MS: 3 ChangeBit-InsertByte-InsertRepeatedBytes- 00:07:40.171 [2024-04-19 10:26:02.280926] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.171 [2024-04-19 10:26:02.280954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.171 [2024-04-19 10:26:02.281017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.171 [2024-04-19 10:26:02.281035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.171 [2024-04-19 10:26:02.281097] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.171 [2024-04-19 10:26:02.281114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.171 [2024-04-19 10:26:02.281206] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.171 [2024-04-19 10:26:02.281222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.430 #13 NEW cov: 11966 ft: 13222 corp: 5/329b lim: 100 exec/s: 0 rss: 69Mb L: 93/93 MS: 1 ChangeBit- 00:07:40.430 [2024-04-19 10:26:02.340685] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4412730751023463741 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.430 [2024-04-19 10:26:02.340712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.430 [2024-04-19 10:26:02.340782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4412750543122677053 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.430 [2024-04-19 10:26:02.340800] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.430 [2024-04-19 10:26:02.340883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4412750543122677053 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.430 [2024-04-19 10:26:02.340901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.430 #14 NEW cov: 11966 ft: 13318 corp: 6/403b lim: 100 exec/s: 0 rss: 69Mb L: 74/93 MS: 1 ChangeByte- 00:07:40.430 [2024-04-19 10:26:02.400487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.430 [2024-04-19 10:26:02.400515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.430 [2024-04-19 10:26:02.400620] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.430 [2024-04-19 10:26:02.400637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.430 #15 NEW cov: 11966 ft: 13733 corp: 7/459b lim: 100 exec/s: 0 rss: 70Mb L: 56/93 MS: 1 EraseBytes- 00:07:40.430 [2024-04-19 10:26:02.461062] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4412730751023463741 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.430 [2024-04-19 10:26:02.461090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.430 [2024-04-19 10:26:02.461161] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4412750543122677053 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.430 [2024-04-19 10:26:02.461177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.430 [2024-04-19 10:26:02.461253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4412750543122677053 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.430 [2024-04-19 10:26:02.461270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.430 #16 NEW cov: 11966 ft: 13806 corp: 8/534b lim: 100 exec/s: 0 rss: 70Mb L: 75/93 MS: 1 InsertByte- 00:07:40.430 [2024-04-19 10:26:02.520689] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4412750543122677053 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.430 [2024-04-19 10:26:02.520716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.690 #20 NEW cov: 11966 ft: 14610 corp: 9/561b lim: 100 exec/s: 0 rss: 70Mb L: 27/93 MS: 4 InsertByte-ChangeByte-CrossOver-CrossOver- 00:07:40.690 [2024-04-19 10:26:02.571814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.690 [2024-04-19 10:26:02.571841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.690 [2024-04-19 
10:26:02.571928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:32 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.690 [2024-04-19 10:26:02.571946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.690 [2024-04-19 10:26:02.572022] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.690 [2024-04-19 10:26:02.572038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.690 [2024-04-19 10:26:02.572125] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.690 [2024-04-19 10:26:02.572147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.690 #21 NEW cov: 11966 ft: 14659 corp: 10/654b lim: 100 exec/s: 0 rss: 70Mb L: 93/93 MS: 1 ChangeBit- 00:07:40.690 [2024-04-19 10:26:02.621914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.690 [2024-04-19 10:26:02.621941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.690 [2024-04-19 10:26:02.622017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.690 [2024-04-19 10:26:02.622033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.690 [2024-04-19 10:26:02.622103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4294967296 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.690 [2024-04-19 10:26:02.622120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.690 [2024-04-19 10:26:02.622217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.690 [2024-04-19 10:26:02.622236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.690 #22 NEW cov: 11966 ft: 14693 corp: 11/747b lim: 100 exec/s: 0 rss: 70Mb L: 93/93 MS: 1 ChangeBit- 00:07:40.690 [2024-04-19 10:26:02.672191] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.690 [2024-04-19 10:26:02.672219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.690 [2024-04-19 10:26:02.672308] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.690 [2024-04-19 10:26:02.672325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.690 [2024-04-19 10:26:02.672384] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4294967296 len:1 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:07:40.690 [2024-04-19 10:26:02.672403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.690 [2024-04-19 10:26:02.672487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.690 [2024-04-19 10:26:02.672503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.690 #23 NEW cov: 11966 ft: 14723 corp: 12/840b lim: 100 exec/s: 0 rss: 70Mb L: 93/93 MS: 1 ShuffleBytes- 00:07:40.690 [2024-04-19 10:26:02.732353] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.690 [2024-04-19 10:26:02.732381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.690 [2024-04-19 10:26:02.732449] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.690 [2024-04-19 10:26:02.732477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.690 [2024-04-19 10:26:02.732549] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4294967296 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.690 [2024-04-19 10:26:02.732569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.690 [2024-04-19 10:26:02.732661] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.690 [2024-04-19 10:26:02.732679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.690 NEW_FUNC[1/1]: 0x19be010 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:40.690 #24 NEW cov: 11989 ft: 14800 corp: 13/933b lim: 100 exec/s: 0 rss: 70Mb L: 93/93 MS: 1 ChangeBit- 00:07:40.690 [2024-04-19 10:26:02.782167] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4412730751023463741 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.690 [2024-04-19 10:26:02.782195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.690 [2024-04-19 10:26:02.782255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4412750543122677053 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.690 [2024-04-19 10:26:02.782273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.690 [2024-04-19 10:26:02.782343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4412750543122677053 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.690 [2024-04-19 10:26:02.782362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.950 #25 NEW cov: 11989 ft: 14843 corp: 14/1008b lim: 100 exec/s: 0 rss: 70Mb L: 75/93 MS: 1 
InsertByte- 00:07:40.950 [2024-04-19 10:26:02.832714] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.950 [2024-04-19 10:26:02.832742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.950 [2024-04-19 10:26:02.832821] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.950 [2024-04-19 10:26:02.832839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.950 [2024-04-19 10:26:02.832909] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4294967296 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.950 [2024-04-19 10:26:02.832927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.950 [2024-04-19 10:26:02.833012] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.950 [2024-04-19 10:26:02.833031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.950 #26 NEW cov: 11989 ft: 14873 corp: 15/1101b lim: 100 exec/s: 26 rss: 70Mb L: 93/93 MS: 1 ChangeBinInt- 00:07:40.950 [2024-04-19 10:26:02.892550] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4412730751023463741 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.950 [2024-04-19 10:26:02.892576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.950 [2024-04-19 10:26:02.892645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4412750543122677053 len:11838 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.950 [2024-04-19 10:26:02.892664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.950 [2024-04-19 10:26:02.892731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4412750543122677053 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.950 [2024-04-19 10:26:02.892748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.950 #27 NEW cov: 11989 ft: 14916 corp: 16/1169b lim: 100 exec/s: 27 rss: 70Mb L: 68/93 MS: 1 EraseBytes- 00:07:40.950 [2024-04-19 10:26:02.953135] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.950 [2024-04-19 10:26:02.953164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.950 [2024-04-19 10:26:02.953245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.950 [2024-04-19 10:26:02.953264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.950 [2024-04-19 10:26:02.953333] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4294967296 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.950 [2024-04-19 10:26:02.953351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.950 [2024-04-19 10:26:02.953445] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.950 [2024-04-19 10:26:02.953461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.950 #28 NEW cov: 11989 ft: 14935 corp: 17/1262b lim: 100 exec/s: 28 rss: 70Mb L: 93/93 MS: 1 CopyPart- 00:07:40.950 [2024-04-19 10:26:03.013438] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.950 [2024-04-19 10:26:03.013466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.950 [2024-04-19 10:26:03.013542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.950 [2024-04-19 10:26:03.013560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.951 [2024-04-19 10:26:03.013634] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.951 [2024-04-19 10:26:03.013655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.951 [2024-04-19 10:26:03.013745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.951 [2024-04-19 10:26:03.013762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.951 #29 NEW cov: 11989 ft: 14952 corp: 18/1355b lim: 100 exec/s: 29 rss: 70Mb L: 93/93 MS: 1 ShuffleBytes- 00:07:41.222 [2024-04-19 10:26:03.063498] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.222 [2024-04-19 10:26:03.063528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.223 [2024-04-19 10:26:03.063584] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-04-19 10:26:03.063603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.223 [2024-04-19 10:26:03.063650] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4294967296 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-04-19 10:26:03.063668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.223 [2024-04-19 10:26:03.063758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:41635 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:41.223 [2024-04-19 10:26:03.063781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.223 #30 NEW cov: 11989 ft: 14972 corp: 19/1454b lim: 100 exec/s: 30 rss: 71Mb L: 99/99 MS: 1 InsertRepeatedBytes- 00:07:41.223 [2024-04-19 10:26:03.123880] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-04-19 10:26:03.123909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.223 [2024-04-19 10:26:03.123976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-04-19 10:26:03.123995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.223 [2024-04-19 10:26:03.124059] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4294967304 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-04-19 10:26:03.124079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.223 [2024-04-19 10:26:03.124164] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:41635 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-04-19 10:26:03.124181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.223 #31 NEW cov: 11989 ft: 14989 corp: 20/1553b lim: 100 exec/s: 31 rss: 71Mb L: 99/99 MS: 1 ChangeBit- 00:07:41.223 [2024-04-19 10:26:03.183841] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-04-19 10:26:03.183870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.223 [2024-04-19 10:26:03.183937] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-04-19 10:26:03.183965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.223 [2024-04-19 10:26:03.184026] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-04-19 10:26:03.184043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.223 #32 NEW cov: 11989 ft: 15004 corp: 21/1621b lim: 100 exec/s: 32 rss: 71Mb L: 68/99 MS: 1 ChangeBinInt- 00:07:41.223 [2024-04-19 10:26:03.234268] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-04-19 10:26:03.234297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.223 [2024-04-19 10:26:03.234373] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 
[2024-04-19 10:26:03.234391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.223 [2024-04-19 10:26:03.234463] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4294967304 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-04-19 10:26:03.234485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.223 [2024-04-19 10:26:03.234578] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:41635 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-04-19 10:26:03.234595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.223 #33 NEW cov: 11989 ft: 15028 corp: 22/1720b lim: 100 exec/s: 33 rss: 71Mb L: 99/99 MS: 1 ShuffleBytes- 00:07:41.223 [2024-04-19 10:26:03.294314] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-04-19 10:26:03.294344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.223 [2024-04-19 10:26:03.294424] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-04-19 10:26:03.294444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.223 [2024-04-19 10:26:03.294497] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4294967304 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-04-19 10:26:03.294516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.223 [2024-04-19 10:26:03.294622] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:296352743424 len:41635 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-04-19 10:26:03.294643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.223 #34 NEW cov: 11989 ft: 15045 corp: 23/1819b lim: 100 exec/s: 34 rss: 71Mb L: 99/99 MS: 1 ChangeByte- 00:07:41.490 [2024-04-19 10:26:03.345007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.490 [2024-04-19 10:26:03.345035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.490 [2024-04-19 10:26:03.345112] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.490 [2024-04-19 10:26:03.345128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.490 [2024-04-19 10:26:03.345196] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4294967304 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.490 [2024-04-19 10:26:03.345213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.490 [2024-04-19 10:26:03.345301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:296352743424 len:41635 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.490 [2024-04-19 10:26:03.345319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.490 [2024-04-19 10:26:03.345413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.490 [2024-04-19 10:26:03.345431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:41.490 #35 NEW cov: 11989 ft: 15094 corp: 24/1919b lim: 100 exec/s: 35 rss: 71Mb L: 100/100 MS: 1 InsertByte- 00:07:41.490 [2024-04-19 10:26:03.405148] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.490 [2024-04-19 10:26:03.405176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.490 [2024-04-19 10:26:03.405255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.490 [2024-04-19 10:26:03.405272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.490 [2024-04-19 10:26:03.405326] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4294967304 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.490 [2024-04-19 10:26:03.405345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.490 [2024-04-19 10:26:03.405430] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:296352743424 len:41635 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.490 [2024-04-19 10:26:03.405450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.490 [2024-04-19 10:26:03.405536] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.490 [2024-04-19 10:26:03.405555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:41.490 #36 NEW cov: 11989 ft: 15133 corp: 25/2019b lim: 100 exec/s: 36 rss: 72Mb L: 100/100 MS: 1 ShuffleBytes- 00:07:41.490 [2024-04-19 10:26:03.464973] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.490 [2024-04-19 10:26:03.465000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.490 [2024-04-19 10:26:03.465096] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.490 [2024-04-19 10:26:03.465113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.490 [2024-04-19 
10:26:03.465183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:10995116277760 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.490 [2024-04-19 10:26:03.465200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.490 [2024-04-19 10:26:03.465296] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.490 [2024-04-19 10:26:03.465315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.490 #37 NEW cov: 11989 ft: 15145 corp: 26/2109b lim: 100 exec/s: 37 rss: 72Mb L: 90/100 MS: 1 CopyPart- 00:07:41.490 [2024-04-19 10:26:03.524870] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.490 [2024-04-19 10:26:03.524899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.490 [2024-04-19 10:26:03.524956] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:17665 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.490 [2024-04-19 10:26:03.524972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.490 [2024-04-19 10:26:03.525043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:63744 len:127 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.490 [2024-04-19 10:26:03.525060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.490 #38 NEW cov: 11989 ft: 15153 corp: 27/2172b lim: 100 exec/s: 38 rss: 72Mb L: 63/100 MS: 1 EraseBytes- 00:07:41.491 [2024-04-19 10:26:03.575035] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4412730751023463741 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.491 [2024-04-19 10:26:03.575063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.491 [2024-04-19 10:26:03.575131] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4412750543122677053 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.491 [2024-04-19 10:26:03.575152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.491 [2024-04-19 10:26:03.575212] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4412750543122677053 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.491 [2024-04-19 10:26:03.575229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.750 #39 NEW cov: 11989 ft: 15174 corp: 28/2247b lim: 100 exec/s: 39 rss: 72Mb L: 75/100 MS: 1 ChangeByte- 00:07:41.750 [2024-04-19 10:26:03.635015] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.750 [2024-04-19 10:26:03.635041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.750 [2024-04-19 10:26:03.635110] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.750 [2024-04-19 10:26:03.635129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.750 #40 NEW cov: 11989 ft: 15244 corp: 29/2303b lim: 100 exec/s: 40 rss: 72Mb L: 56/100 MS: 1 ChangeByte- 00:07:41.750 [2024-04-19 10:26:03.685478] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4412730751023463741 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.750 [2024-04-19 10:26:03.685504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.750 [2024-04-19 10:26:03.685582] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4412750543122677053 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.750 [2024-04-19 10:26:03.685598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.750 [2024-04-19 10:26:03.685667] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4412750543122677053 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.750 [2024-04-19 10:26:03.685686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.750 #41 NEW cov: 11989 ft: 15253 corp: 30/2378b lim: 100 exec/s: 41 rss: 72Mb L: 75/100 MS: 1 ChangeBit- 00:07:41.750 [2024-04-19 10:26:03.735380] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.750 [2024-04-19 10:26:03.735408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.750 [2024-04-19 10:26:03.735467] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.750 [2024-04-19 10:26:03.735485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.750 #42 NEW cov: 11989 ft: 15262 corp: 31/2434b lim: 100 exec/s: 42 rss: 72Mb L: 56/100 MS: 1 ChangeByte- 00:07:41.750 [2024-04-19 10:26:03.785853] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4412750542232763691 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.750 [2024-04-19 10:26:03.785881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.750 [2024-04-19 10:26:03.785946] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4412750543122677053 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.750 [2024-04-19 10:26:03.785963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.750 [2024-04-19 10:26:03.786044] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4412750543122677053 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.750 [2024-04-19 10:26:03.786065] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.750 #43 NEW cov: 11989 ft: 15267 corp: 32/2509b lim: 100 exec/s: 43 rss: 72Mb L: 75/100 MS: 1 ShuffleBytes- 00:07:41.750 [2024-04-19 10:26:03.836345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18014398509481984 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.750 [2024-04-19 10:26:03.836373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.750 [2024-04-19 10:26:03.836458] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.750 [2024-04-19 10:26:03.836474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.750 [2024-04-19 10:26:03.836554] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4294967296 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.750 [2024-04-19 10:26:03.836571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.750 [2024-04-19 10:26:03.836661] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:41635 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.750 [2024-04-19 10:26:03.836678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.750 #44 NEW cov: 11989 ft: 15269 corp: 33/2608b lim: 100 exec/s: 22 rss: 72Mb L: 99/100 MS: 1 ChangeBit- 00:07:41.750 #44 DONE cov: 11989 ft: 15269 corp: 33/2608b lim: 100 exec/s: 22 rss: 72Mb 00:07:41.750 Done 44 runs in 2 second(s) 00:07:42.009 10:26:03 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_24.conf /var/tmp/suppress_nvmf_fuzz 00:07:42.010 10:26:03 -- ../common.sh@72 -- # (( i++ )) 00:07:42.010 10:26:03 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:42.010 10:26:03 -- nvmf/run.sh@79 -- # trap - SIGINT SIGTERM EXIT 00:07:42.010 00:07:42.010 real 1m5.426s 00:07:42.010 user 1m40.621s 00:07:42.010 sys 0m8.238s 00:07:42.010 10:26:03 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:42.010 10:26:03 -- common/autotest_common.sh@10 -- # set +x 00:07:42.010 ************************************ 00:07:42.010 END TEST nvmf_fuzz 00:07:42.010 ************************************ 00:07:42.010 10:26:04 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:42.010 10:26:04 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:42.010 10:26:04 -- fuzz/llvm.sh@63 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:07:42.010 10:26:04 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:42.010 10:26:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:42.010 10:26:04 -- common/autotest_common.sh@10 -- # set +x 00:07:42.271 ************************************ 00:07:42.271 START TEST vfio_fuzz 00:07:42.271 ************************************ 00:07:42.271 10:26:04 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:07:42.271 * Looking for test storage... 
00:07:42.271 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:42.271 10:26:04 -- vfio/run.sh@64 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:42.271 10:26:04 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:42.271 10:26:04 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:42.271 10:26:04 -- common/autotest_common.sh@34 -- # set -e 00:07:42.271 10:26:04 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:42.271 10:26:04 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:42.271 10:26:04 -- common/autotest_common.sh@38 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:42.271 10:26:04 -- common/autotest_common.sh@43 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:42.271 10:26:04 -- common/autotest_common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:42.271 10:26:04 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:42.271 10:26:04 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:42.271 10:26:04 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:42.271 10:26:04 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:42.271 10:26:04 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:42.271 10:26:04 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:42.271 10:26:04 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:42.271 10:26:04 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:42.271 10:26:04 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:42.271 10:26:04 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:42.271 10:26:04 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:42.271 10:26:04 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:42.271 10:26:04 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:42.271 10:26:04 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:42.271 10:26:04 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:42.271 10:26:04 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:42.271 10:26:04 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:42.271 10:26:04 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:42.271 10:26:04 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:42.271 10:26:04 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:42.271 10:26:04 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:42.271 10:26:04 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:42.271 10:26:04 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:42.271 10:26:04 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:42.271 10:26:04 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:42.271 10:26:04 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:42.271 10:26:04 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:42.271 10:26:04 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:42.271 10:26:04 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:42.271 10:26:04 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:42.271 10:26:04 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:42.271 10:26:04 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 
00:07:42.271 10:26:04 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:42.271 10:26:04 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:42.271 10:26:04 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:42.271 10:26:04 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:42.271 10:26:04 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:42.271 10:26:04 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:42.271 10:26:04 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:42.271 10:26:04 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:42.271 10:26:04 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:07:42.271 10:26:04 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:42.271 10:26:04 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:42.271 10:26:04 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:42.271 10:26:04 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:42.271 10:26:04 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:42.271 10:26:04 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:42.271 10:26:04 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:42.271 10:26:04 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:42.271 10:26:04 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:42.271 10:26:04 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:42.271 10:26:04 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:42.271 10:26:04 -- common/build_config.sh@53 -- # CONFIG_HAVE_EVP_MAC=y 00:07:42.271 10:26:04 -- common/build_config.sh@54 -- # CONFIG_URING_ZNS=n 00:07:42.271 10:26:04 -- common/build_config.sh@55 -- # CONFIG_WERROR=y 00:07:42.271 10:26:04 -- common/build_config.sh@56 -- # CONFIG_HAVE_LIBBSD=n 00:07:42.271 10:26:04 -- common/build_config.sh@57 -- # CONFIG_UBSAN=y 00:07:42.271 10:26:04 -- common/build_config.sh@58 -- # CONFIG_IPSEC_MB_DIR= 00:07:42.271 10:26:04 -- common/build_config.sh@59 -- # CONFIG_GOLANG=n 00:07:42.271 10:26:04 -- common/build_config.sh@60 -- # CONFIG_ISAL=y 00:07:42.271 10:26:04 -- common/build_config.sh@61 -- # CONFIG_IDXD_KERNEL=n 00:07:42.271 10:26:04 -- common/build_config.sh@62 -- # CONFIG_DPDK_LIB_DIR= 00:07:42.271 10:26:04 -- common/build_config.sh@63 -- # CONFIG_RDMA_PROV=verbs 00:07:42.271 10:26:04 -- common/build_config.sh@64 -- # CONFIG_APPS=y 00:07:42.271 10:26:04 -- common/build_config.sh@65 -- # CONFIG_SHARED=n 00:07:42.271 10:26:04 -- common/build_config.sh@66 -- # CONFIG_HAVE_KEYUTILS=n 00:07:42.271 10:26:04 -- common/build_config.sh@67 -- # CONFIG_FC_PATH= 00:07:42.271 10:26:04 -- common/build_config.sh@68 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:42.271 10:26:04 -- common/build_config.sh@69 -- # CONFIG_FC=n 00:07:42.271 10:26:04 -- common/build_config.sh@70 -- # CONFIG_AVAHI=n 00:07:42.271 10:26:04 -- common/build_config.sh@71 -- # CONFIG_FIO_PLUGIN=y 00:07:42.271 10:26:04 -- common/build_config.sh@72 -- # CONFIG_RAID5F=n 00:07:42.271 10:26:04 -- common/build_config.sh@73 -- # CONFIG_EXAMPLES=y 00:07:42.271 10:26:04 -- common/build_config.sh@74 -- # CONFIG_TESTS=y 00:07:42.271 10:26:04 -- common/build_config.sh@75 -- # CONFIG_CRYPTO_MLX5=n 00:07:42.271 10:26:04 -- common/build_config.sh@76 -- # CONFIG_MAX_LCORES= 00:07:42.271 10:26:04 -- common/build_config.sh@77 -- # CONFIG_IPSEC_MB=n 00:07:42.271 10:26:04 -- common/build_config.sh@78 -- # CONFIG_PGO_DIR= 
00:07:42.271 10:26:04 -- common/build_config.sh@79 -- # CONFIG_DEBUG=y 00:07:42.271 10:26:04 -- common/build_config.sh@80 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:42.271 10:26:04 -- common/build_config.sh@81 -- # CONFIG_CROSS_PREFIX= 00:07:42.271 10:26:04 -- common/build_config.sh@82 -- # CONFIG_URING=n 00:07:42.271 10:26:04 -- common/autotest_common.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:42.271 10:26:04 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:42.271 10:26:04 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:42.271 10:26:04 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:42.271 10:26:04 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:42.271 10:26:04 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:42.271 10:26:04 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:42.271 10:26:04 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:42.271 10:26:04 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:42.271 10:26:04 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:42.271 10:26:04 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:42.271 10:26:04 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:42.271 10:26:04 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:42.271 10:26:04 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:42.271 10:26:04 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:42.271 10:26:04 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:42.272 #define SPDK_CONFIG_H 00:07:42.272 #define SPDK_CONFIG_APPS 1 00:07:42.272 #define SPDK_CONFIG_ARCH native 00:07:42.272 #undef SPDK_CONFIG_ASAN 00:07:42.272 #undef SPDK_CONFIG_AVAHI 00:07:42.272 #undef SPDK_CONFIG_CET 00:07:42.272 #define SPDK_CONFIG_COVERAGE 1 00:07:42.272 #define SPDK_CONFIG_CROSS_PREFIX 00:07:42.272 #undef SPDK_CONFIG_CRYPTO 00:07:42.272 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:42.272 #undef SPDK_CONFIG_CUSTOMOCF 00:07:42.272 #undef SPDK_CONFIG_DAOS 00:07:42.272 #define SPDK_CONFIG_DAOS_DIR 00:07:42.272 #define SPDK_CONFIG_DEBUG 1 00:07:42.272 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:42.272 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:42.272 #define SPDK_CONFIG_DPDK_INC_DIR 00:07:42.272 #define SPDK_CONFIG_DPDK_LIB_DIR 00:07:42.272 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:42.272 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:42.272 #define SPDK_CONFIG_EXAMPLES 1 00:07:42.272 #undef SPDK_CONFIG_FC 00:07:42.272 #define SPDK_CONFIG_FC_PATH 00:07:42.272 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:42.272 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:42.272 #undef SPDK_CONFIG_FUSE 00:07:42.272 #define SPDK_CONFIG_FUZZER 1 00:07:42.272 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:42.272 #undef SPDK_CONFIG_GOLANG 00:07:42.272 
#define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:42.272 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:07:42.272 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:42.272 #undef SPDK_CONFIG_HAVE_KEYUTILS 00:07:42.272 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:42.272 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:42.272 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:42.272 #define SPDK_CONFIG_IDXD 1 00:07:42.272 #undef SPDK_CONFIG_IDXD_KERNEL 00:07:42.272 #undef SPDK_CONFIG_IPSEC_MB 00:07:42.272 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:42.272 #define SPDK_CONFIG_ISAL 1 00:07:42.272 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:42.272 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:42.272 #define SPDK_CONFIG_LIBDIR 00:07:42.272 #undef SPDK_CONFIG_LTO 00:07:42.272 #define SPDK_CONFIG_MAX_LCORES 00:07:42.272 #define SPDK_CONFIG_NVME_CUSE 1 00:07:42.272 #undef SPDK_CONFIG_OCF 00:07:42.272 #define SPDK_CONFIG_OCF_PATH 00:07:42.272 #define SPDK_CONFIG_OPENSSL_PATH 00:07:42.272 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:42.272 #define SPDK_CONFIG_PGO_DIR 00:07:42.272 #undef SPDK_CONFIG_PGO_USE 00:07:42.272 #define SPDK_CONFIG_PREFIX /usr/local 00:07:42.272 #undef SPDK_CONFIG_RAID5F 00:07:42.272 #undef SPDK_CONFIG_RBD 00:07:42.272 #define SPDK_CONFIG_RDMA 1 00:07:42.272 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:42.272 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:42.272 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:42.272 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:42.272 #undef SPDK_CONFIG_SHARED 00:07:42.272 #undef SPDK_CONFIG_SMA 00:07:42.272 #define SPDK_CONFIG_TESTS 1 00:07:42.272 #undef SPDK_CONFIG_TSAN 00:07:42.272 #define SPDK_CONFIG_UBLK 1 00:07:42.272 #define SPDK_CONFIG_UBSAN 1 00:07:42.272 #undef SPDK_CONFIG_UNIT_TESTS 00:07:42.272 #undef SPDK_CONFIG_URING 00:07:42.272 #define SPDK_CONFIG_URING_PATH 00:07:42.272 #undef SPDK_CONFIG_URING_ZNS 00:07:42.272 #undef SPDK_CONFIG_USDT 00:07:42.272 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:42.272 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:42.272 #define SPDK_CONFIG_VFIO_USER 1 00:07:42.272 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:42.272 #define SPDK_CONFIG_VHOST 1 00:07:42.272 #define SPDK_CONFIG_VIRTIO 1 00:07:42.272 #undef SPDK_CONFIG_VTUNE 00:07:42.272 #define SPDK_CONFIG_VTUNE_DIR 00:07:42.272 #define SPDK_CONFIG_WERROR 1 00:07:42.272 #define SPDK_CONFIG_WPDK_DIR 00:07:42.272 #undef SPDK_CONFIG_XNVME 00:07:42.272 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:42.272 10:26:04 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:42.272 10:26:04 -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:42.272 10:26:04 -- scripts/common.sh@502 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:42.272 10:26:04 -- scripts/common.sh@510 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:42.272 10:26:04 -- scripts/common.sh@511 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:42.272 10:26:04 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:42.272 10:26:04 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:42.272 10:26:04 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:42.272 10:26:04 -- paths/export.sh@5 -- # export PATH 00:07:42.272 10:26:04 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:42.272 10:26:04 -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:42.272 10:26:04 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:42.272 10:26:04 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:42.272 10:26:04 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:42.272 10:26:04 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:42.272 10:26:04 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:42.272 10:26:04 -- pm/common@67 -- # TEST_TAG=N/A 00:07:42.272 10:26:04 -- pm/common@68 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:42.272 10:26:04 -- pm/common@70 -- # PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:07:42.272 10:26:04 -- pm/common@71 -- # uname -s 00:07:42.272 10:26:04 -- pm/common@71 -- # PM_OS=Linux 00:07:42.272 10:26:04 -- pm/common@73 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:07:42.272 10:26:04 -- pm/common@74 -- # [[ Linux == FreeBSD ]] 00:07:42.272 10:26:04 -- pm/common@76 -- # [[ Linux == Linux ]] 00:07:42.272 10:26:04 -- pm/common@76 -- # [[ ............................... != QEMU ]] 00:07:42.272 10:26:04 -- pm/common@76 -- # [[ ! 
-e /.dockerenv ]] 00:07:42.272 10:26:04 -- pm/common@79 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:07:42.272 10:26:04 -- pm/common@80 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:07:42.272 10:26:04 -- pm/common@83 -- # MONITOR_RESOURCES_PIDS=() 00:07:42.272 10:26:04 -- pm/common@83 -- # declare -A MONITOR_RESOURCES_PIDS 00:07:42.272 10:26:04 -- pm/common@85 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:07:42.272 10:26:04 -- common/autotest_common.sh@57 -- # : 0 00:07:42.272 10:26:04 -- common/autotest_common.sh@58 -- # export RUN_NIGHTLY 00:07:42.272 10:26:04 -- common/autotest_common.sh@61 -- # : 0 00:07:42.272 10:26:04 -- common/autotest_common.sh@62 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:42.272 10:26:04 -- common/autotest_common.sh@63 -- # : 0 00:07:42.272 10:26:04 -- common/autotest_common.sh@64 -- # export SPDK_RUN_VALGRIND 00:07:42.272 10:26:04 -- common/autotest_common.sh@65 -- # : 1 00:07:42.272 10:26:04 -- common/autotest_common.sh@66 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:42.272 10:26:04 -- common/autotest_common.sh@67 -- # : 0 00:07:42.272 10:26:04 -- common/autotest_common.sh@68 -- # export SPDK_TEST_UNITTEST 00:07:42.272 10:26:04 -- common/autotest_common.sh@69 -- # : 00:07:42.272 10:26:04 -- common/autotest_common.sh@70 -- # export SPDK_TEST_AUTOBUILD 00:07:42.272 10:26:04 -- common/autotest_common.sh@71 -- # : 0 00:07:42.272 10:26:04 -- common/autotest_common.sh@72 -- # export SPDK_TEST_RELEASE_BUILD 00:07:42.272 10:26:04 -- common/autotest_common.sh@73 -- # : 0 00:07:42.272 10:26:04 -- common/autotest_common.sh@74 -- # export SPDK_TEST_ISAL 00:07:42.272 10:26:04 -- common/autotest_common.sh@75 -- # : 0 00:07:42.272 10:26:04 -- common/autotest_common.sh@76 -- # export SPDK_TEST_ISCSI 00:07:42.272 10:26:04 -- common/autotest_common.sh@77 -- # : 0 00:07:42.272 10:26:04 -- common/autotest_common.sh@78 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:42.272 10:26:04 -- common/autotest_common.sh@79 -- # : 0 00:07:42.272 10:26:04 -- common/autotest_common.sh@80 -- # export SPDK_TEST_NVME 00:07:42.272 10:26:04 -- common/autotest_common.sh@81 -- # : 0 00:07:42.272 10:26:04 -- common/autotest_common.sh@82 -- # export SPDK_TEST_NVME_PMR 00:07:42.272 10:26:04 -- common/autotest_common.sh@83 -- # : 0 00:07:42.272 10:26:04 -- common/autotest_common.sh@84 -- # export SPDK_TEST_NVME_BP 00:07:42.272 10:26:04 -- common/autotest_common.sh@85 -- # : 0 00:07:42.272 10:26:04 -- common/autotest_common.sh@86 -- # export SPDK_TEST_NVME_CLI 00:07:42.272 10:26:04 -- common/autotest_common.sh@87 -- # : 0 00:07:42.272 10:26:04 -- common/autotest_common.sh@88 -- # export SPDK_TEST_NVME_CUSE 00:07:42.272 10:26:04 -- common/autotest_common.sh@89 -- # : 0 00:07:42.272 10:26:04 -- common/autotest_common.sh@90 -- # export SPDK_TEST_NVME_FDP 00:07:42.272 10:26:04 -- common/autotest_common.sh@91 -- # : 0 00:07:42.272 10:26:04 -- common/autotest_common.sh@92 -- # export SPDK_TEST_NVMF 00:07:42.272 10:26:04 -- common/autotest_common.sh@93 -- # : 0 00:07:42.272 10:26:04 -- common/autotest_common.sh@94 -- # export SPDK_TEST_VFIOUSER 00:07:42.272 10:26:04 -- common/autotest_common.sh@95 -- # : 0 00:07:42.272 10:26:04 -- common/autotest_common.sh@96 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:42.272 10:26:04 -- common/autotest_common.sh@97 -- # : 1 00:07:42.273 10:26:04 -- common/autotest_common.sh@98 -- # export SPDK_TEST_FUZZER 00:07:42.273 10:26:04 -- common/autotest_common.sh@99 -- # : 1 00:07:42.273 10:26:04 -- common/autotest_common.sh@100 -- # export 
SPDK_TEST_FUZZER_SHORT 00:07:42.273 10:26:04 -- common/autotest_common.sh@101 -- # : rdma 00:07:42.273 10:26:04 -- common/autotest_common.sh@102 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:42.273 10:26:04 -- common/autotest_common.sh@103 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@104 -- # export SPDK_TEST_RBD 00:07:42.273 10:26:04 -- common/autotest_common.sh@105 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@106 -- # export SPDK_TEST_VHOST 00:07:42.273 10:26:04 -- common/autotest_common.sh@107 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@108 -- # export SPDK_TEST_BLOCKDEV 00:07:42.273 10:26:04 -- common/autotest_common.sh@109 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@110 -- # export SPDK_TEST_IOAT 00:07:42.273 10:26:04 -- common/autotest_common.sh@111 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@112 -- # export SPDK_TEST_BLOBFS 00:07:42.273 10:26:04 -- common/autotest_common.sh@113 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@114 -- # export SPDK_TEST_VHOST_INIT 00:07:42.273 10:26:04 -- common/autotest_common.sh@115 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@116 -- # export SPDK_TEST_LVOL 00:07:42.273 10:26:04 -- common/autotest_common.sh@117 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@118 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:42.273 10:26:04 -- common/autotest_common.sh@119 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@120 -- # export SPDK_RUN_ASAN 00:07:42.273 10:26:04 -- common/autotest_common.sh@121 -- # : 1 00:07:42.273 10:26:04 -- common/autotest_common.sh@122 -- # export SPDK_RUN_UBSAN 00:07:42.273 10:26:04 -- common/autotest_common.sh@123 -- # : 00:07:42.273 10:26:04 -- common/autotest_common.sh@124 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:42.273 10:26:04 -- common/autotest_common.sh@125 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@126 -- # export SPDK_RUN_NON_ROOT 00:07:42.273 10:26:04 -- common/autotest_common.sh@127 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@128 -- # export SPDK_TEST_CRYPTO 00:07:42.273 10:26:04 -- common/autotest_common.sh@129 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@130 -- # export SPDK_TEST_FTL 00:07:42.273 10:26:04 -- common/autotest_common.sh@131 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@132 -- # export SPDK_TEST_OCF 00:07:42.273 10:26:04 -- common/autotest_common.sh@133 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@134 -- # export SPDK_TEST_VMD 00:07:42.273 10:26:04 -- common/autotest_common.sh@135 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@136 -- # export SPDK_TEST_OPAL 00:07:42.273 10:26:04 -- common/autotest_common.sh@137 -- # : 00:07:42.273 10:26:04 -- common/autotest_common.sh@138 -- # export SPDK_TEST_NATIVE_DPDK 00:07:42.273 10:26:04 -- common/autotest_common.sh@139 -- # : true 00:07:42.273 10:26:04 -- common/autotest_common.sh@140 -- # export SPDK_AUTOTEST_X 00:07:42.273 10:26:04 -- common/autotest_common.sh@141 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@142 -- # export SPDK_TEST_RAID5 00:07:42.273 10:26:04 -- common/autotest_common.sh@143 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@144 -- # export SPDK_TEST_URING 00:07:42.273 10:26:04 -- common/autotest_common.sh@145 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@146 -- # export SPDK_TEST_USDT 00:07:42.273 10:26:04 -- common/autotest_common.sh@147 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@148 
-- # export SPDK_TEST_USE_IGB_UIO 00:07:42.273 10:26:04 -- common/autotest_common.sh@149 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@150 -- # export SPDK_TEST_SCHEDULER 00:07:42.273 10:26:04 -- common/autotest_common.sh@151 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@152 -- # export SPDK_TEST_SCANBUILD 00:07:42.273 10:26:04 -- common/autotest_common.sh@153 -- # : 00:07:42.273 10:26:04 -- common/autotest_common.sh@154 -- # export SPDK_TEST_NVMF_NICS 00:07:42.273 10:26:04 -- common/autotest_common.sh@155 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@156 -- # export SPDK_TEST_SMA 00:07:42.273 10:26:04 -- common/autotest_common.sh@157 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@158 -- # export SPDK_TEST_DAOS 00:07:42.273 10:26:04 -- common/autotest_common.sh@159 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@160 -- # export SPDK_TEST_XNVME 00:07:42.273 10:26:04 -- common/autotest_common.sh@161 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@162 -- # export SPDK_TEST_ACCEL_DSA 00:07:42.273 10:26:04 -- common/autotest_common.sh@163 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@164 -- # export SPDK_TEST_ACCEL_IAA 00:07:42.273 10:26:04 -- common/autotest_common.sh@166 -- # : 00:07:42.273 10:26:04 -- common/autotest_common.sh@167 -- # export SPDK_TEST_FUZZER_TARGET 00:07:42.273 10:26:04 -- common/autotest_common.sh@168 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@169 -- # export SPDK_TEST_NVMF_MDNS 00:07:42.273 10:26:04 -- common/autotest_common.sh@170 -- # : 0 00:07:42.273 10:26:04 -- common/autotest_common.sh@171 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:42.273 10:26:04 -- common/autotest_common.sh@174 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:42.273 10:26:04 -- common/autotest_common.sh@174 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:42.273 10:26:04 -- common/autotest_common.sh@175 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:42.273 10:26:04 -- common/autotest_common.sh@175 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:42.273 10:26:04 -- common/autotest_common.sh@176 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:42.273 10:26:04 -- common/autotest_common.sh@176 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:42.273 10:26:04 -- common/autotest_common.sh@177 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:42.273 10:26:04 -- 
common/autotest_common.sh@177 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:42.273 10:26:04 -- common/autotest_common.sh@180 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:42.273 10:26:04 -- common/autotest_common.sh@180 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:42.273 10:26:04 -- common/autotest_common.sh@184 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:42.273 10:26:04 -- common/autotest_common.sh@184 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:42.273 10:26:04 -- common/autotest_common.sh@188 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:42.273 10:26:04 -- common/autotest_common.sh@188 -- # PYTHONDONTWRITEBYTECODE=1 00:07:42.273 10:26:04 -- common/autotest_common.sh@192 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:42.273 10:26:04 -- common/autotest_common.sh@192 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:42.273 10:26:04 -- common/autotest_common.sh@193 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:42.273 10:26:04 -- common/autotest_common.sh@193 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:42.273 10:26:04 -- common/autotest_common.sh@197 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:42.273 10:26:04 -- common/autotest_common.sh@198 -- # rm -rf /var/tmp/asan_suppression_file 00:07:42.273 10:26:04 -- common/autotest_common.sh@199 -- # cat 00:07:42.273 10:26:04 -- common/autotest_common.sh@225 -- # echo leak:libfuse3.so 00:07:42.273 10:26:04 -- common/autotest_common.sh@227 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:42.273 10:26:04 -- common/autotest_common.sh@227 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 
00:07:42.273 10:26:04 -- common/autotest_common.sh@229 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:42.273 10:26:04 -- common/autotest_common.sh@229 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:42.273 10:26:04 -- common/autotest_common.sh@231 -- # '[' -z /var/spdk/dependencies ']' 00:07:42.273 10:26:04 -- common/autotest_common.sh@234 -- # export DEPENDENCY_DIR 00:07:42.273 10:26:04 -- common/autotest_common.sh@238 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:42.273 10:26:04 -- common/autotest_common.sh@238 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:42.273 10:26:04 -- common/autotest_common.sh@239 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:42.273 10:26:04 -- common/autotest_common.sh@239 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:42.273 10:26:04 -- common/autotest_common.sh@242 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:42.273 10:26:04 -- common/autotest_common.sh@242 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:42.273 10:26:04 -- common/autotest_common.sh@243 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:42.273 10:26:04 -- common/autotest_common.sh@243 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:42.273 10:26:04 -- common/autotest_common.sh@245 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:42.274 10:26:04 -- common/autotest_common.sh@245 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:42.274 10:26:04 -- common/autotest_common.sh@248 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:42.274 10:26:04 -- common/autotest_common.sh@248 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:42.274 10:26:04 -- common/autotest_common.sh@251 -- # '[' 0 -eq 0 ']' 00:07:42.274 10:26:04 -- common/autotest_common.sh@252 -- # export valgrind= 00:07:42.274 10:26:04 -- common/autotest_common.sh@252 -- # valgrind= 00:07:42.274 10:26:04 -- common/autotest_common.sh@258 -- # uname -s 00:07:42.274 10:26:04 -- common/autotest_common.sh@258 -- # '[' Linux = Linux ']' 00:07:42.274 10:26:04 -- common/autotest_common.sh@259 -- # HUGEMEM=4096 00:07:42.274 10:26:04 -- common/autotest_common.sh@260 -- # export CLEAR_HUGE=yes 00:07:42.274 10:26:04 -- common/autotest_common.sh@260 -- # CLEAR_HUGE=yes 00:07:42.274 10:26:04 -- common/autotest_common.sh@261 -- # [[ 0 -eq 1 ]] 00:07:42.274 10:26:04 -- common/autotest_common.sh@261 -- # [[ 0 -eq 1 ]] 00:07:42.274 10:26:04 -- common/autotest_common.sh@268 -- # MAKE=make 00:07:42.274 10:26:04 -- common/autotest_common.sh@269 -- # MAKEFLAGS=-j72 00:07:42.274 10:26:04 -- common/autotest_common.sh@285 -- # export HUGEMEM=4096 00:07:42.274 10:26:04 -- common/autotest_common.sh@285 -- # HUGEMEM=4096 00:07:42.274 10:26:04 -- common/autotest_common.sh@287 -- # NO_HUGE=() 00:07:42.274 10:26:04 -- common/autotest_common.sh@288 -- # TEST_MODE= 00:07:42.274 10:26:04 -- common/autotest_common.sh@307 -- # [[ -z 208919 ]] 00:07:42.274 10:26:04 -- common/autotest_common.sh@307 -- # kill -0 208919 00:07:42.274 10:26:04 -- common/autotest_common.sh@1666 -- # set_test_storage 2147483648 00:07:42.274 10:26:04 -- common/autotest_common.sh@317 -- # [[ -v testdir ]] 00:07:42.274 10:26:04 -- common/autotest_common.sh@319 -- # local requested_size=2147483648 
00:07:42.274 10:26:04 -- common/autotest_common.sh@320 -- # local mount target_dir 00:07:42.274 10:26:04 -- common/autotest_common.sh@322 -- # local -A mounts fss sizes avails uses 00:07:42.274 10:26:04 -- common/autotest_common.sh@323 -- # local source fs size avail mount use 00:07:42.274 10:26:04 -- common/autotest_common.sh@325 -- # local storage_fallback storage_candidates 00:07:42.274 10:26:04 -- common/autotest_common.sh@327 -- # mktemp -udt spdk.XXXXXX 00:07:42.274 10:26:04 -- common/autotest_common.sh@327 -- # storage_fallback=/tmp/spdk.q27MWc 00:07:42.274 10:26:04 -- common/autotest_common.sh@332 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:42.274 10:26:04 -- common/autotest_common.sh@334 -- # [[ -n '' ]] 00:07:42.274 10:26:04 -- common/autotest_common.sh@339 -- # [[ -n '' ]] 00:07:42.274 10:26:04 -- common/autotest_common.sh@344 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.q27MWc/tests/vfio /tmp/spdk.q27MWc 00:07:42.274 10:26:04 -- common/autotest_common.sh@347 -- # requested_size=2214592512 00:07:42.274 10:26:04 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:42.274 10:26:04 -- common/autotest_common.sh@316 -- # df -T 00:07:42.274 10:26:04 -- common/autotest_common.sh@316 -- # grep -v Filesystem 00:07:42.274 10:26:04 -- common/autotest_common.sh@350 -- # mounts["$mount"]=spdk_devtmpfs 00:07:42.274 10:26:04 -- common/autotest_common.sh@350 -- # fss["$mount"]=devtmpfs 00:07:42.274 10:26:04 -- common/autotest_common.sh@351 -- # avails["$mount"]=67108864 00:07:42.274 10:26:04 -- common/autotest_common.sh@351 -- # sizes["$mount"]=67108864 00:07:42.274 10:26:04 -- common/autotest_common.sh@352 -- # uses["$mount"]=0 00:07:42.274 10:26:04 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:42.274 10:26:04 -- common/autotest_common.sh@350 -- # mounts["$mount"]=/dev/pmem0 00:07:42.274 10:26:04 -- common/autotest_common.sh@350 -- # fss["$mount"]=ext2 00:07:42.274 10:26:04 -- common/autotest_common.sh@351 -- # avails["$mount"]=995880960 00:07:42.274 10:26:04 -- common/autotest_common.sh@351 -- # sizes["$mount"]=5284429824 00:07:42.274 10:26:04 -- common/autotest_common.sh@352 -- # uses["$mount"]=4288548864 00:07:42.274 10:26:04 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:42.274 10:26:04 -- common/autotest_common.sh@350 -- # mounts["$mount"]=spdk_root 00:07:42.274 10:26:04 -- common/autotest_common.sh@350 -- # fss["$mount"]=overlay 00:07:42.274 10:26:04 -- common/autotest_common.sh@351 -- # avails["$mount"]=54697521152 00:07:42.274 10:26:04 -- common/autotest_common.sh@351 -- # sizes["$mount"]=61742718976 00:07:42.274 10:26:04 -- common/autotest_common.sh@352 -- # uses["$mount"]=7045197824 00:07:42.274 10:26:04 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:42.274 10:26:04 -- common/autotest_common.sh@350 -- # mounts["$mount"]=/dev/sda1 00:07:42.274 10:26:04 -- common/autotest_common.sh@350 -- # fss["$mount"]=xfs 00:07:42.274 10:26:04 -- common/autotest_common.sh@351 -- # avails["$mount"]=221821267968 00:07:42.274 10:26:04 -- common/autotest_common.sh@351 -- # sizes["$mount"]=239938535424 00:07:42.274 10:26:04 -- common/autotest_common.sh@352 -- # uses["$mount"]=18117267456 00:07:42.274 10:26:04 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:42.274 10:26:04 -- common/autotest_common.sh@350 -- # 
mounts["$mount"]=tmpfs 00:07:42.274 10:26:04 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:07:42.274 10:26:04 -- common/autotest_common.sh@351 -- # avails["$mount"]=30870081536 00:07:42.533 10:26:04 -- common/autotest_common.sh@351 -- # sizes["$mount"]=30871359488 00:07:42.533 10:26:04 -- common/autotest_common.sh@352 -- # uses["$mount"]=1277952 00:07:42.533 10:26:04 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:42.533 10:26:04 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:07:42.533 10:26:04 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:07:42.533 10:26:04 -- common/autotest_common.sh@351 -- # avails["$mount"]=12342808576 00:07:42.533 10:26:04 -- common/autotest_common.sh@351 -- # sizes["$mount"]=12348547072 00:07:42.533 10:26:04 -- common/autotest_common.sh@352 -- # uses["$mount"]=5738496 00:07:42.533 10:26:04 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:42.533 10:26:04 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:07:42.533 10:26:04 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:07:42.533 10:26:04 -- common/autotest_common.sh@351 -- # avails["$mount"]=30870974464 00:07:42.533 10:26:04 -- common/autotest_common.sh@351 -- # sizes["$mount"]=30871359488 00:07:42.533 10:26:04 -- common/autotest_common.sh@352 -- # uses["$mount"]=385024 00:07:42.533 10:26:04 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:42.533 10:26:04 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:07:42.533 10:26:04 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:07:42.533 10:26:04 -- common/autotest_common.sh@351 -- # avails["$mount"]=6174265344 00:07:42.533 10:26:04 -- common/autotest_common.sh@351 -- # sizes["$mount"]=6174269440 00:07:42.533 10:26:04 -- common/autotest_common.sh@352 -- # uses["$mount"]=4096 00:07:42.533 10:26:04 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:42.533 10:26:04 -- common/autotest_common.sh@355 -- # printf '* Looking for test storage...\n' 00:07:42.534 * Looking for test storage... 
00:07:42.534 10:26:04 -- common/autotest_common.sh@357 -- # local target_space new_size 00:07:42.534 10:26:04 -- common/autotest_common.sh@358 -- # for target_dir in "${storage_candidates[@]}" 00:07:42.534 10:26:04 -- common/autotest_common.sh@361 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:42.534 10:26:04 -- common/autotest_common.sh@361 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:42.534 10:26:04 -- common/autotest_common.sh@361 -- # mount=/ 00:07:42.534 10:26:04 -- common/autotest_common.sh@363 -- # target_space=54697521152 00:07:42.534 10:26:04 -- common/autotest_common.sh@364 -- # (( target_space == 0 || target_space < requested_size )) 00:07:42.534 10:26:04 -- common/autotest_common.sh@367 -- # (( target_space >= requested_size )) 00:07:42.534 10:26:04 -- common/autotest_common.sh@369 -- # [[ overlay == tmpfs ]] 00:07:42.534 10:26:04 -- common/autotest_common.sh@369 -- # [[ overlay == ramfs ]] 00:07:42.534 10:26:04 -- common/autotest_common.sh@369 -- # [[ / == / ]] 00:07:42.534 10:26:04 -- common/autotest_common.sh@370 -- # new_size=9259790336 00:07:42.534 10:26:04 -- common/autotest_common.sh@371 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:42.534 10:26:04 -- common/autotest_common.sh@376 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:42.534 10:26:04 -- common/autotest_common.sh@376 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:42.534 10:26:04 -- common/autotest_common.sh@377 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:42.534 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:42.534 10:26:04 -- common/autotest_common.sh@378 -- # return 0 00:07:42.534 10:26:04 -- common/autotest_common.sh@1668 -- # set -o errtrace 00:07:42.534 10:26:04 -- common/autotest_common.sh@1669 -- # shopt -s extdebug 00:07:42.534 10:26:04 -- common/autotest_common.sh@1670 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:42.534 10:26:04 -- common/autotest_common.sh@1672 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:42.534 10:26:04 -- common/autotest_common.sh@1673 -- # true 00:07:42.534 10:26:04 -- common/autotest_common.sh@1675 -- # xtrace_fd 00:07:42.534 10:26:04 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:42.534 10:26:04 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:42.534 10:26:04 -- common/autotest_common.sh@27 -- # exec 00:07:42.534 10:26:04 -- common/autotest_common.sh@29 -- # exec 00:07:42.534 10:26:04 -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:42.534 10:26:04 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:07:42.534 10:26:04 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:42.534 10:26:04 -- common/autotest_common.sh@18 -- # set -x 00:07:42.534 10:26:04 -- vfio/run.sh@65 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:07:42.534 10:26:04 -- ../common.sh@8 -- # pids=() 00:07:42.534 10:26:04 -- vfio/run.sh@67 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:07:42.534 10:26:04 -- vfio/run.sh@68 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:07:42.534 10:26:04 -- vfio/run.sh@68 -- # fuzz_num=7 00:07:42.534 10:26:04 -- vfio/run.sh@69 -- # (( fuzz_num != 0 )) 00:07:42.534 10:26:04 -- vfio/run.sh@71 -- # trap 'cleanup /tmp/vfio-user-* /var/tmp/suppress_vfio_fuzz; exit 1' SIGINT SIGTERM EXIT 00:07:42.534 10:26:04 -- vfio/run.sh@74 -- # mem_size=0 00:07:42.534 10:26:04 -- vfio/run.sh@75 -- # [[ 1 -eq 1 ]] 00:07:42.534 10:26:04 -- vfio/run.sh@76 -- # start_llvm_fuzz_short 7 1 00:07:42.534 10:26:04 -- ../common.sh@69 -- # local fuzz_num=7 00:07:42.534 10:26:04 -- ../common.sh@70 -- # local time=1 00:07:42.534 10:26:04 -- ../common.sh@72 -- # (( i = 0 )) 00:07:42.534 10:26:04 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:42.534 10:26:04 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:42.534 10:26:04 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:07:42.534 10:26:04 -- vfio/run.sh@23 -- # local timen=1 00:07:42.534 10:26:04 -- vfio/run.sh@24 -- # local core=0x1 00:07:42.534 10:26:04 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:07:42.534 10:26:04 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:07:42.534 10:26:04 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:07:42.534 10:26:04 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:07:42.534 10:26:04 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:07:42.534 10:26:04 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:07:42.534 10:26:04 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:07:42.534 10:26:04 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:07:42.534 10:26:04 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:07:42.534 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:42.534 10:26:04 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:42.534 10:26:04 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:07:42.534 10:26:04 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:07:42.534 [2024-04-19 10:26:04.442792] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 
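Each fuzzer instance gets a private /tmp/vfio-user-N tree. run.sh rewrites the shared fuzz_vfio_json.conf with sed so the vfio-user socket paths land in that tree, appends two leak suppressions, and launches llvm_vfio_fuzz with the flags shown above. A parameterized sketch of that per-instance setup ($spdk and $corpus stand in for the long workspace paths; the sed output redirection into the per-instance config is implied by the later -c argument rather than shown in the trace):

    i=0                                   # fuzzer type index
    root=/tmp/vfio-user-$i
    mkdir -p "$root/domain/1" "$root/domain/2" "$corpus"

    # Retarget the template JSON config at this instance's sockets.
    sed -e "s%/tmp/vfio-user/domain/1%$root/domain/1%;
            s%/tmp/vfio-user/domain/2%$root/domain/2%" \
        "$spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf" \
        > "$root/fuzz_vfio_json.conf"

    # Leak patterns the harness suppresses for every vfio fuzz run.
    echo leak:spdk_nvmf_qpair_disconnect >> /var/tmp/suppress_vfio_fuzz
    echo leak:nvmf_ctrlr_create          >> /var/tmp/suppress_vfio_fuzz

    "$spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz" \
        -m 0x1 -s 0 \
        -F "$root/domain/1" -Y "$root/domain/2" \
        -c "$root/fuzz_vfio_json.conf" \
        -D "$corpus" -r "$root/spdk$i.sock" \
        -t 1 -Z "$i"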
00:07:42.534 [2024-04-19 10:26:04.442887] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid209079 ] 00:07:42.534 EAL: No free 2048 kB hugepages reported on node 1 00:07:42.534 [2024-04-19 10:26:04.517801] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.534 [2024-04-19 10:26:04.593330] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.793 INFO: Running with entropic power schedule (0xFF, 100). 00:07:42.793 INFO: Seed: 3118052547 00:07:42.793 INFO: Loaded 1 modules (345392 inline 8-bit counters): 345392 [0x286cf8c, 0x28c14bc), 00:07:42.793 INFO: Loaded 1 PC tables (345392 PCs): 345392 [0x28c14c0,0x2e067c0), 00:07:42.793 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:07:42.793 INFO: A corpus is not provided, starting from an empty corpus 00:07:42.793 #2 INITED exec/s: 0 rss: 64Mb 00:07:42.793 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:42.793 This may also happen if the target rejected all inputs we tried so far 00:07:42.793 [2024-04-19 10:26:04.830129] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: enabling controller 00:07:43.309 NEW_FUNC[1/634]: 0x481720 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:84 00:07:43.309 NEW_FUNC[2/634]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:43.309 #41 NEW cov: 10791 ft: 10543 corp: 2/7b lim: 6 exec/s: 0 rss: 69Mb L: 6/6 MS: 4 ShuffleBytes-InsertRepeatedBytes-CopyPart-CopyPart- 00:07:43.309 #42 NEW cov: 10805 ft: 13483 corp: 3/13b lim: 6 exec/s: 0 rss: 70Mb L: 6/6 MS: 1 ChangeByte- 00:07:43.568 #43 NEW cov: 10805 ft: 13982 corp: 4/19b lim: 6 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 ChangeByte- 00:07:43.568 #44 NEW cov: 10808 ft: 14580 corp: 5/25b lim: 6 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 CrossOver- 00:07:43.826 NEW_FUNC[1/1]: 0x198a540 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:43.826 #45 NEW cov: 10825 ft: 14899 corp: 6/31b lim: 6 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 ShuffleBytes- 00:07:43.826 #46 NEW cov: 10825 ft: 15026 corp: 7/37b lim: 6 exec/s: 46 rss: 72Mb L: 6/6 MS: 1 CopyPart- 00:07:43.826 #47 NEW cov: 10825 ft: 15117 corp: 8/43b lim: 6 exec/s: 47 rss: 72Mb L: 6/6 MS: 1 ChangeBinInt- 00:07:44.084 #52 NEW cov: 10825 ft: 15467 corp: 9/49b lim: 6 exec/s: 52 rss: 72Mb L: 6/6 MS: 5 EraseBytes-CopyPart-ChangeBinInt-ChangeByte-InsertByte- 00:07:44.084 #58 NEW cov: 10825 ft: 16365 corp: 10/55b lim: 6 exec/s: 58 rss: 72Mb L: 6/6 MS: 1 ChangeBit- 00:07:44.342 #64 NEW cov: 10825 ft: 16569 corp: 11/61b lim: 6 exec/s: 64 rss: 72Mb L: 6/6 MS: 1 ChangeByte- 00:07:44.342 #67 NEW cov: 10825 ft: 17098 corp: 12/67b lim: 6 exec/s: 67 rss: 72Mb L: 6/6 MS: 3 CrossOver-ChangeBit-InsertByte- 00:07:44.600 #68 NEW cov: 10825 ft: 17128 corp: 13/73b lim: 6 exec/s: 68 rss: 72Mb L: 6/6 MS: 1 CopyPart- 00:07:44.600 #69 NEW cov: 10832 ft: 17317 corp: 14/79b lim: 6 exec/s: 69 rss: 72Mb L: 6/6 MS: 1 ShuffleBytes- 00:07:44.858 #70 NEW cov: 10832 ft: 17402 corp: 15/85b lim: 6 exec/s: 70 rss: 72Mb L: 6/6 MS: 1 ShuffleBytes- 00:07:44.858 #71 NEW cov: 10832 ft: 17441 corp: 16/91b lim: 6 exec/s: 35 rss: 72Mb L: 6/6 MS: 1 CopyPart- 
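The "#41 NEW cov: ... ft: ... corp: ... exec/s: ... rss: ..." records above are standard libFuzzer status lines: cov counts covered code edges, ft counts features, corp is the corpus size in units/bytes, L is the length of the input that triggered the event, and MS: names the mutation sequence that produced it. A small sketch that pulls coverage growth out of a saved console log (assuming the exact "#<n> NEW cov: <edges>" layout above; it tolerates the Jenkins timestamp prefixes because it matches a substring):

    # Emit "<event> <edges>" for every coverage-increasing input.
    grep -Eo '#[0-9]+ NEW cov: [0-9]+' console.log |
        awk '{ print $1, $4 }'

Applied to the run above it would print the climb from #41 at 10791 edges up to #71 at 10832.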
00:07:44.858 #71 DONE cov: 10832 ft: 17441 corp: 16/91b lim: 6 exec/s: 35 rss: 72Mb 00:07:44.858 Done 71 runs in 2 second(s) 00:07:44.858 [2024-04-19 10:26:06.863004] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: disabling controller 00:07:45.117 10:26:07 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-0 /var/tmp/suppress_vfio_fuzz 00:07:45.117 10:26:07 -- ../common.sh@72 -- # (( i++ )) 00:07:45.117 10:26:07 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:45.117 10:26:07 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:45.117 10:26:07 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:07:45.117 10:26:07 -- vfio/run.sh@23 -- # local timen=1 00:07:45.117 10:26:07 -- vfio/run.sh@24 -- # local core=0x1 00:07:45.117 10:26:07 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:07:45.117 10:26:07 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:07:45.117 10:26:07 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:07:45.117 10:26:07 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:07:45.117 10:26:07 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:07:45.117 10:26:07 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:07:45.117 10:26:07 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:07:45.117 10:26:07 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:07:45.117 10:26:07 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:07:45.117 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:45.117 10:26:07 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:45.117 10:26:07 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:07:45.118 10:26:07 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:07:45.118 [2024-04-19 10:26:07.133439] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:07:45.118 [2024-04-19 10:26:07.133505] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid209454 ] 00:07:45.118 EAL: No free 2048 kB hugepages reported on node 1 00:07:45.118 [2024-04-19 10:26:07.202397] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.376 [2024-04-19 10:26:07.283755] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.376 INFO: Running with entropic power schedule (0xFF, 100). 
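The "(( i++ ))" / "(( i < fuzz_num ))" trace at the run boundary above is the short-fuzz driver in common.sh: it walks the seven fuzzer types (fuzz_num=7, counted earlier with grep -c '\.fn =' over llvm_vfio_fuzz.c) and gives each a one-second run, with a per-run rm -rf of the instance tree and suppression file in between. Condensed to a single for header, the driver looks like this sketch:

    # start_llvm_fuzz_short <fuzz_num> <seconds-per-fuzzer>
    start_llvm_fuzz_short() {
        local fuzz_num=$1
        local time=$2
        for (( i = 0; i < fuzz_num; i++ )); do
            start_llvm_fuzz "$i" "$time" 0x1   # one fuzzer type, core mask 0x1
        done
    }

The traced loop keeps its (( i = 0 )), (( i < fuzz_num )) and (( i++ )) steps separate, which this sketch collapses for brevity.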
00:07:45.376 INFO: Seed: 1511084253 00:07:45.635 INFO: Loaded 1 modules (345392 inline 8-bit counters): 345392 [0x286cf8c, 0x28c14bc), 00:07:45.635 INFO: Loaded 1 PC tables (345392 PCs): 345392 [0x28c14c0,0x2e067c0), 00:07:45.635 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:07:45.635 INFO: A corpus is not provided, starting from an empty corpus 00:07:45.635 #2 INITED exec/s: 0 rss: 64Mb 00:07:45.635 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:45.635 This may also happen if the target rejected all inputs we tried so far 00:07:45.635 [2024-04-19 10:26:07.520830] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: enabling controller 00:07:45.635 [2024-04-19 10:26:07.599603] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:45.635 [2024-04-19 10:26:07.599629] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:45.635 [2024-04-19 10:26:07.599648] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:45.893 NEW_FUNC[1/635]: 0x481cc0 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:71 00:07:45.893 NEW_FUNC[2/635]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:45.893 #44 NEW cov: 10787 ft: 10325 corp: 2/5b lim: 4 exec/s: 0 rss: 69Mb L: 4/4 MS: 2 CopyPart-CopyPart- 00:07:46.151 [2024-04-19 10:26:08.097120] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:46.151 [2024-04-19 10:26:08.097157] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:46.151 [2024-04-19 10:26:08.097175] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:46.151 NEW_FUNC[1/1]: 0x16566e0 in nvme_qpair_is_admin_queue /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1128 00:07:46.151 #45 NEW cov: 10804 ft: 13117 corp: 3/9b lim: 4 exec/s: 0 rss: 70Mb L: 4/4 MS: 1 CrossOver- 00:07:46.409 [2024-04-19 10:26:08.314057] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:46.409 [2024-04-19 10:26:08.314084] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:46.409 [2024-04-19 10:26:08.314102] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:46.409 NEW_FUNC[1/1]: 0x198a540 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:46.409 #54 NEW cov: 10821 ft: 14804 corp: 4/13b lim: 4 exec/s: 0 rss: 72Mb L: 4/4 MS: 4 CrossOver-InsertByte-ChangeBinInt-CrossOver- 00:07:46.666 [2024-04-19 10:26:08.525832] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:46.666 [2024-04-19 10:26:08.525855] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:46.666 [2024-04-19 10:26:08.525889] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:46.666 #60 NEW cov: 10821 ft: 15331 corp: 5/17b lim: 4 exec/s: 60 rss: 72Mb L: 4/4 MS: 1 ChangeBit- 00:07:46.666 [2024-04-19 10:26:08.725434] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:46.666 [2024-04-19 10:26:08.725455] vfio_user.c:3106:vfio_user_log: *ERROR*: 
/tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:46.666 [2024-04-19 10:26:08.725473] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:46.924 #61 NEW cov: 10821 ft: 15584 corp: 6/21b lim: 4 exec/s: 61 rss: 72Mb L: 4/4 MS: 1 ChangeBinInt- 00:07:46.924 [2024-04-19 10:26:08.927968] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:46.924 [2024-04-19 10:26:08.927990] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:46.924 [2024-04-19 10:26:08.928008] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:47.182 #62 NEW cov: 10821 ft: 15668 corp: 7/25b lim: 4 exec/s: 62 rss: 72Mb L: 4/4 MS: 1 ChangeBinInt- 00:07:47.182 [2024-04-19 10:26:09.132240] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:47.182 [2024-04-19 10:26:09.132262] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:47.182 [2024-04-19 10:26:09.132280] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:47.182 #63 NEW cov: 10821 ft: 16070 corp: 8/29b lim: 4 exec/s: 63 rss: 72Mb L: 4/4 MS: 1 ChangeBit- 00:07:47.439 [2024-04-19 10:26:09.339295] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:47.439 [2024-04-19 10:26:09.339318] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:47.439 [2024-04-19 10:26:09.339336] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:47.439 #64 NEW cov: 10828 ft: 16409 corp: 9/33b lim: 4 exec/s: 64 rss: 72Mb L: 4/4 MS: 1 ChangeByte- 00:07:47.439 [2024-04-19 10:26:09.548024] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:47.439 [2024-04-19 10:26:09.548046] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:47.439 [2024-04-19 10:26:09.548064] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:47.698 #65 NEW cov: 10828 ft: 16647 corp: 10/37b lim: 4 exec/s: 32 rss: 72Mb L: 4/4 MS: 1 ChangeByte- 00:07:47.698 #65 DONE cov: 10828 ft: 16647 corp: 10/37b lim: 4 exec/s: 32 rss: 72Mb 00:07:47.698 Done 65 runs in 2 second(s) 00:07:47.698 [2024-04-19 10:26:09.686010] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: disabling controller 00:07:47.956 10:26:09 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-1 /var/tmp/suppress_vfio_fuzz 00:07:47.956 10:26:09 -- ../common.sh@72 -- # (( i++ )) 00:07:47.956 10:26:09 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:47.956 10:26:09 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:47.956 10:26:09 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:07:47.956 10:26:09 -- vfio/run.sh@23 -- # local timen=1 00:07:47.956 10:26:09 -- vfio/run.sh@24 -- # local core=0x1 00:07:47.956 10:26:09 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:07:47.956 10:26:09 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:07:47.956 10:26:09 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:07:47.956 10:26:09 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:07:47.956 10:26:09 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:07:47.956 10:26:09 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 
00:07:47.956 10:26:09 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:07:47.956 10:26:09 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:07:47.956 10:26:09 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:07:47.956 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:47.956 10:26:09 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:47.956 10:26:09 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:07:47.956 10:26:09 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:07:47.956 [2024-04-19 10:26:09.974064] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:07:47.956 [2024-04-19 10:26:09.974136] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid209820 ] 00:07:47.956 EAL: No free 2048 kB hugepages reported on node 1 00:07:47.956 [2024-04-19 10:26:10.047921] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.214 [2024-04-19 10:26:10.131814] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.214 INFO: Running with entropic power schedule (0xFF, 100). 00:07:48.214 INFO: Seed: 70130067 00:07:48.472 INFO: Loaded 1 modules (345392 inline 8-bit counters): 345392 [0x286cf8c, 0x28c14bc), 00:07:48.472 INFO: Loaded 1 PC tables (345392 PCs): 345392 [0x28c14c0,0x2e067c0), 00:07:48.472 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:07:48.472 INFO: A corpus is not provided, starting from an empty corpus 00:07:48.472 #2 INITED exec/s: 0 rss: 63Mb 00:07:48.472 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:48.472 This may also happen if the target rejected all inputs we tried so far 00:07:48.472 [2024-04-19 10:26:10.376803] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: enabling controller 00:07:48.472 [2024-04-19 10:26:10.400201] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:48.732 NEW_FUNC[1/635]: 0x4826a0 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:103 00:07:48.732 NEW_FUNC[2/635]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:48.732 #24 NEW cov: 10770 ft: 10569 corp: 2/9b lim: 8 exec/s: 0 rss: 69Mb L: 8/8 MS: 2 InsertRepeatedBytes-InsertRepeatedBytes- 00:07:48.732 [2024-04-19 10:26:10.824429] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:48.990 #25 NEW cov: 10784 ft: 13141 corp: 3/17b lim: 8 exec/s: 0 rss: 70Mb L: 8/8 MS: 1 ShuffleBytes- 00:07:48.990 [2024-04-19 10:26:10.939245] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:48.990 #26 NEW cov: 10787 ft: 14132 corp: 4/25b lim: 8 exec/s: 0 rss: 71Mb L: 8/8 MS: 1 ChangeBinInt- 00:07:48.990 [2024-04-19 10:26:11.053192] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:49.249 #27 NEW cov: 10787 ft: 14608 corp: 5/33b lim: 8 exec/s: 0 rss: 71Mb L: 8/8 MS: 1 ChangeBit- 00:07:49.249 [2024-04-19 10:26:11.167262] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:49.249 NEW_FUNC[1/1]: 0x198a540 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:49.249 #28 NEW cov: 10804 ft: 15039 corp: 6/41b lim: 8 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 CopyPart- 00:07:49.249 [2024-04-19 10:26:11.281218] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:49.507 #38 NEW cov: 10804 ft: 16150 corp: 7/49b lim: 8 exec/s: 38 rss: 72Mb L: 8/8 MS: 5 InsertByte-CopyPart-InsertRepeatedBytes-ChangeByte-CopyPart- 00:07:49.507 [2024-04-19 10:26:11.405106] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:49.507 #39 NEW cov: 10804 ft: 16336 corp: 8/57b lim: 8 exec/s: 39 rss: 72Mb L: 8/8 MS: 1 ChangeBit- 00:07:49.507 [2024-04-19 10:26:11.519514] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:49.507 #40 NEW cov: 10804 ft: 16501 corp: 9/65b lim: 8 exec/s: 40 rss: 72Mb L: 8/8 MS: 1 CopyPart- 00:07:49.764 [2024-04-19 10:26:11.633667] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:49.764 #41 NEW cov: 10804 ft: 16826 corp: 10/73b lim: 8 exec/s: 41 rss: 72Mb L: 8/8 MS: 1 ChangeBit- 00:07:49.764 [2024-04-19 10:26:11.757408] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:49.764 #42 NEW cov: 10804 ft: 16892 corp: 11/81b lim: 8 exec/s: 42 rss: 72Mb L: 8/8 MS: 1 CMP- DE: "\377\377\377\377"- 00:07:49.764 [2024-04-19 10:26:11.871914] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:50.022 #43 NEW cov: 10804 ft: 17036 corp: 12/89b lim: 8 exec/s: 43 rss: 72Mb L: 8/8 MS: 1 CopyPart- 00:07:50.022 [2024-04-19 10:26:11.985918] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:50.022 #44 NEW 
cov: 10804 ft: 17136 corp: 13/97b lim: 8 exec/s: 44 rss: 72Mb L: 8/8 MS: 1 CopyPart- 00:07:50.022 [2024-04-19 10:26:12.100025] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:50.279 #45 NEW cov: 10811 ft: 17289 corp: 14/105b lim: 8 exec/s: 45 rss: 72Mb L: 8/8 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:07:50.279 [2024-04-19 10:26:12.215537] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:50.279 #46 NEW cov: 10811 ft: 17450 corp: 15/113b lim: 8 exec/s: 46 rss: 72Mb L: 8/8 MS: 1 ChangeByte- 00:07:50.279 [2024-04-19 10:26:12.329978] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:50.538 #47 NEW cov: 10811 ft: 17555 corp: 16/121b lim: 8 exec/s: 23 rss: 72Mb L: 8/8 MS: 1 ChangeBit- 00:07:50.538 #47 DONE cov: 10811 ft: 17555 corp: 16/121b lim: 8 exec/s: 23 rss: 72Mb 00:07:50.538 ###### Recommended dictionary. ###### 00:07:50.538 "\377\377\377\377" # Uses: 1 00:07:50.538 ###### End of recommended dictionary. ###### 00:07:50.538 Done 47 runs in 2 second(s) 00:07:50.538 [2024-04-19 10:26:12.422006] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: disabling controller 00:07:50.796 10:26:12 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-2 /var/tmp/suppress_vfio_fuzz 00:07:50.796 10:26:12 -- ../common.sh@72 -- # (( i++ )) 00:07:50.796 10:26:12 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:50.796 10:26:12 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:50.796 10:26:12 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:07:50.796 10:26:12 -- vfio/run.sh@23 -- # local timen=1 00:07:50.796 10:26:12 -- vfio/run.sh@24 -- # local core=0x1 00:07:50.796 10:26:12 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:07:50.796 10:26:12 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:07:50.796 10:26:12 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:07:50.796 10:26:12 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:07:50.796 10:26:12 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:07:50.796 10:26:12 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:07:50.796 10:26:12 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:07:50.796 10:26:12 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:07:50.796 10:26:12 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:07:50.796 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:50.796 10:26:12 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:50.796 10:26:12 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:07:50.796 10:26:12 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:07:50.796 [2024-04-19 10:26:12.700895] 
Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:07:50.796 [2024-04-19 10:26:12.700952] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid210174 ] 00:07:50.796 EAL: No free 2048 kB hugepages reported on node 1 00:07:50.796 [2024-04-19 10:26:12.770279] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.796 [2024-04-19 10:26:12.846651] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.056 INFO: Running with entropic power schedule (0xFF, 100). 00:07:51.056 INFO: Seed: 2780117581 00:07:51.056 INFO: Loaded 1 modules (345392 inline 8-bit counters): 345392 [0x286cf8c, 0x28c14bc), 00:07:51.056 INFO: Loaded 1 PC tables (345392 PCs): 345392 [0x28c14c0,0x2e067c0), 00:07:51.056 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:07:51.056 INFO: A corpus is not provided, starting from an empty corpus 00:07:51.056 #2 INITED exec/s: 0 rss: 64Mb 00:07:51.056 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:51.056 This may also happen if the target rejected all inputs we tried so far 00:07:51.056 [2024-04-19 10:26:13.092094] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: enabling controller 00:07:51.056 [2024-04-19 10:26:13.147845] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: DMA region size 6293595036912670551 > max 8796093022208 00:07:51.056 [2024-04-19 10:26:13.147869] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x575757570a00ffff, 0xaeaeaeae61585756) offset=0x5757575757575757 flags=0x3: No space left on device 00:07:51.056 [2024-04-19 10:26:13.147880] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: No space left on device 00:07:51.056 [2024-04-19 10:26:13.147897] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:07:51.571 NEW_FUNC[1/636]: 0x482d80 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:124 00:07:51.571 NEW_FUNC[2/636]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:51.571 #42 NEW cov: 10784 ft: 10750 corp: 2/33b lim: 32 exec/s: 0 rss: 69Mb L: 32/32 MS: 5 CMP-EraseBytes-CrossOver-CMP-InsertRepeatedBytes- DE: "\001\000\000\000\000\000\000\001"-"\377\377\377\000"- 00:07:51.571 [2024-04-19 10:26:13.635358] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: DMA region size 6269046240799315799 > max 8796093022208 00:07:51.571 [2024-04-19 10:26:13.635394] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x575757570a00ffff, 0xae5777ae61585756) offset=0x5757575757575757 flags=0x3: No space left on device 00:07:51.571 [2024-04-19 10:26:13.635406] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: No space left on device 00:07:51.571 [2024-04-19 10:26:13.635438] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:07:51.830 #53 NEW cov: 10798 ft: 13577 corp: 3/65b lim: 32 exec/s: 0 rss: 70Mb L: 32/32 MS: 1 ChangeBinInt- 00:07:51.830 [2024-04-19 10:26:13.836800] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: DMA region 
size 6293595036912670551 > max 8796093022208 00:07:51.830 [2024-04-19 10:26:13.836832] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x575757570a00ffff, 0xaeaeaeae61585756) offset=0x5757575757575757 flags=0x3: No space left on device 00:07:51.830 [2024-04-19 10:26:13.836844] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: No space left on device 00:07:51.830 [2024-04-19 10:26:13.836861] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:07:52.089 NEW_FUNC[1/1]: 0x198a540 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:52.089 #59 NEW cov: 10815 ft: 14886 corp: 4/97b lim: 32 exec/s: 0 rss: 71Mb L: 32/32 MS: 1 ShuffleBytes- 00:07:52.089 [2024-04-19 10:26:14.037403] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: DMA region size 6269046240799315799 > max 8796093022208 00:07:52.089 [2024-04-19 10:26:14.037426] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x575757570a00ffff, 0xae5777ae61585756) offset=0x5757575757575757 flags=0x3: No space left on device 00:07:52.089 [2024-04-19 10:26:14.037438] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: No space left on device 00:07:52.089 [2024-04-19 10:26:14.037455] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:07:52.089 #60 NEW cov: 10815 ft: 15827 corp: 5/129b lim: 32 exec/s: 60 rss: 72Mb L: 32/32 MS: 1 ShuffleBytes- 00:07:52.347 [2024-04-19 10:26:14.237291] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x575757570a00ffff, 0x5757575861585756) fd=325 offset=0x5757575701000000 prot=0x3: Permission denied 00:07:52.347 [2024-04-19 10:26:14.237313] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x575757570a00ffff, 0x5757575861585756) offset=0x5757575701000000 flags=0x3: Permission denied 00:07:52.347 [2024-04-19 10:26:14.237323] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Permission denied 00:07:52.347 [2024-04-19 10:26:14.237356] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:07:52.347 #61 NEW cov: 10815 ft: 16232 corp: 6/161b lim: 32 exec/s: 61 rss: 72Mb L: 32/32 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\001"- 00:07:52.347 [2024-04-19 10:26:14.436975] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: DMA region size 6269046240799315799 > max 8796093022208 00:07:52.347 [2024-04-19 10:26:14.436997] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x57575700ffffffff, 0xae57775857575756) offset=0x5757575757575757 flags=0x3: No space left on device 00:07:52.347 [2024-04-19 10:26:14.437008] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: No space left on device 00:07:52.347 [2024-04-19 10:26:14.437024] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:07:52.606 #67 NEW cov: 10815 ft: 16904 corp: 7/193b lim: 32 exec/s: 67 rss: 72Mb L: 32/32 MS: 1 PersAutoDict- DE: "\377\377\377\000"- 00:07:52.606 [2024-04-19 10:26:14.634328] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: DMA region size 6293595036912663895 > max 8796093022208 00:07:52.606 [2024-04-19 10:26:14.634351] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region 
[0x575757570a00ffff, 0xaeaeaeae61583d56) offset=0x5757575757575757 flags=0x3: No space left on device 00:07:52.606 [2024-04-19 10:26:14.634362] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: No space left on device 00:07:52.606 [2024-04-19 10:26:14.634380] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:07:52.865 #78 NEW cov: 10815 ft: 17037 corp: 8/225b lim: 32 exec/s: 78 rss: 72Mb L: 32/32 MS: 1 ChangeByte- 00:07:52.865 [2024-04-19 10:26:14.833998] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: DMA region size 6293595036912670551 > max 8796093022208 00:07:52.865 [2024-04-19 10:26:14.834021] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x575757570a00ffff, 0xaeaeaeae61585756) offset=0x5757575757575757 flags=0x3: No space left on device 00:07:52.865 [2024-04-19 10:26:14.834032] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: No space left on device 00:07:52.865 [2024-04-19 10:26:14.834048] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:07:52.865 #79 NEW cov: 10822 ft: 17064 corp: 9/257b lim: 32 exec/s: 79 rss: 72Mb L: 32/32 MS: 1 CopyPart- 00:07:53.124 [2024-04-19 10:26:15.032618] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: DMA region size 6293595036912650752 > max 8796093022208 00:07:53.124 [2024-04-19 10:26:15.032642] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0xffff57570a00ffff, 0x5756aeae615809ff) offset=0x5757570020575757 flags=0x3: No space left on device 00:07:53.124 [2024-04-19 10:26:15.032657] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: No space left on device 00:07:53.124 [2024-04-19 10:26:15.032674] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:07:53.124 #90 NEW cov: 10822 ft: 17385 corp: 10/289b lim: 32 exec/s: 45 rss: 72Mb L: 32/32 MS: 1 CrossOver- 00:07:53.124 #90 DONE cov: 10822 ft: 17385 corp: 10/289b lim: 32 exec/s: 45 rss: 72Mb 00:07:53.124 ###### Recommended dictionary. ###### 00:07:53.124 "\001\000\000\000\000\000\000\001" # Uses: 3 00:07:53.124 "\377\377\377\000" # Uses: 3 00:07:53.124 ###### End of recommended dictionary. 
###### 00:07:53.124 Done 90 runs in 2 second(s) 00:07:53.124 [2024-04-19 10:26:15.174003] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: disabling controller 00:07:53.383 10:26:15 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-3 /var/tmp/suppress_vfio_fuzz 00:07:53.383 10:26:15 -- ../common.sh@72 -- # (( i++ )) 00:07:53.383 10:26:15 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:53.383 10:26:15 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:53.383 10:26:15 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:07:53.383 10:26:15 -- vfio/run.sh@23 -- # local timen=1 00:07:53.383 10:26:15 -- vfio/run.sh@24 -- # local core=0x1 00:07:53.383 10:26:15 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:07:53.383 10:26:15 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:07:53.383 10:26:15 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:07:53.383 10:26:15 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:07:53.383 10:26:15 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:07:53.383 10:26:15 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:07:53.383 10:26:15 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:07:53.383 10:26:15 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:07:53.383 10:26:15 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:07:53.383 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:53.383 10:26:15 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:53.383 10:26:15 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:07:53.383 10:26:15 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:07:53.383 [2024-04-19 10:26:15.450746] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:07:53.383 [2024-04-19 10:26:15.450802] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid210522 ] 00:07:53.383 EAL: No free 2048 kB hugepages reported on node 1 00:07:53.642 [2024-04-19 10:26:15.520660] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.642 [2024-04-19 10:26:15.596734] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.902 INFO: Running with entropic power schedule (0xFF, 100). 
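The dma_map run above (fuzzer 3) keeps tripping the vfio-user transport's size guard: "DMA region size 6293595036912670551 > max 8796093022208". Both numbers follow from the logged region bounds: the size is end minus start, and the limit is 2^43 bytes (8 TiB). A quick shell check of that arithmetic (bash wraps to 64-bit two's complement, which is why subtracting these large hex values still lands on the right answer):

    # Size = end - start, from the logged [start, end) pair.
    printf '%d\n' $(( 0xaeaeaeae61585756 - 0x575757570a00ffff ))
    # -> 6293595036912670551, i.e. 0x5757575757575757, matching the
    #    repeated 0x57 fill byte the mutator favors in this run.

    # The rejection threshold is 8 TiB:
    printf '%d\n' $(( 1 << 43 ))
    # -> 8796093022208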
00:07:53.902 INFO: Seed: 1234146181 00:07:53.902 INFO: Loaded 1 modules (345392 inline 8-bit counters): 345392 [0x286cf8c, 0x28c14bc), 00:07:53.902 INFO: Loaded 1 PC tables (345392 PCs): 345392 [0x28c14c0,0x2e067c0), 00:07:53.902 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:07:53.902 INFO: A corpus is not provided, starting from an empty corpus 00:07:53.902 #2 INITED exec/s: 0 rss: 64Mb 00:07:53.902 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:53.902 This may also happen if the target rejected all inputs we tried so far 00:07:53.902 [2024-04-19 10:26:15.834643] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: enabling controller 00:07:54.161 NEW_FUNC[1/635]: 0x483600 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:144 00:07:54.161 NEW_FUNC[2/635]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:54.161 #272 NEW cov: 10780 ft: 10729 corp: 2/33b lim: 32 exec/s: 0 rss: 69Mb L: 32/32 MS: 5 InsertRepeatedBytes-ShuffleBytes-ShuffleBytes-ChangeBinInt-InsertRepeatedBytes- 00:07:54.420 #273 NEW cov: 10794 ft: 13521 corp: 3/65b lim: 32 exec/s: 0 rss: 70Mb L: 32/32 MS: 1 ChangeBit- 00:07:54.420 #274 NEW cov: 10794 ft: 14678 corp: 4/97b lim: 32 exec/s: 0 rss: 71Mb L: 32/32 MS: 1 ShuffleBytes- 00:07:54.679 #280 NEW cov: 10797 ft: 15262 corp: 5/129b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 ChangeASCIIInt- 00:07:54.679 NEW_FUNC[1/1]: 0x198a540 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:54.679 #281 NEW cov: 10814 ft: 15334 corp: 6/161b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 ChangeBinInt- 00:07:54.938 #282 NEW cov: 10814 ft: 15712 corp: 7/193b lim: 32 exec/s: 282 rss: 72Mb L: 32/32 MS: 1 ChangeByte- 00:07:54.938 #283 NEW cov: 10814 ft: 15921 corp: 8/225b lim: 32 exec/s: 283 rss: 72Mb L: 32/32 MS: 1 CopyPart- 00:07:54.938 #284 NEW cov: 10814 ft: 16472 corp: 9/257b lim: 32 exec/s: 284 rss: 72Mb L: 32/32 MS: 1 ShuffleBytes- 00:07:55.196 #290 NEW cov: 10814 ft: 16499 corp: 10/289b lim: 32 exec/s: 290 rss: 72Mb L: 32/32 MS: 1 ChangeBinInt- 00:07:55.196 #291 NEW cov: 10814 ft: 16633 corp: 11/321b lim: 32 exec/s: 291 rss: 72Mb L: 32/32 MS: 1 CopyPart- 00:07:55.454 #297 NEW cov: 10814 ft: 16669 corp: 12/353b lim: 32 exec/s: 297 rss: 72Mb L: 32/32 MS: 1 ChangeBinInt- 00:07:55.454 #298 NEW cov: 10814 ft: 16722 corp: 13/385b lim: 32 exec/s: 298 rss: 72Mb L: 32/32 MS: 1 CrossOver- 00:07:55.713 #299 NEW cov: 10821 ft: 16737 corp: 14/417b lim: 32 exec/s: 299 rss: 72Mb L: 32/32 MS: 1 ChangeByte- 00:07:55.713 #300 NEW cov: 10821 ft: 16765 corp: 15/449b lim: 32 exec/s: 300 rss: 72Mb L: 32/32 MS: 1 CrossOver- 00:07:55.972 #301 NEW cov: 10821 ft: 16788 corp: 16/481b lim: 32 exec/s: 150 rss: 72Mb L: 32/32 MS: 1 CrossOver- 00:07:55.972 #301 DONE cov: 10821 ft: 16788 corp: 16/481b lim: 32 exec/s: 150 rss: 72Mb 00:07:55.972 Done 301 runs in 2 second(s) 00:07:55.972 [2024-04-19 10:26:17.862994] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: disabling controller 00:07:56.231 10:26:18 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-4 /var/tmp/suppress_vfio_fuzz 00:07:56.231 10:26:18 -- ../common.sh@72 -- # (( i++ )) 00:07:56.231 10:26:18 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:56.231 10:26:18 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:56.231 10:26:18 -- 
vfio/run.sh@22 -- # local fuzzer_type=5 00:07:56.231 10:26:18 -- vfio/run.sh@23 -- # local timen=1 00:07:56.231 10:26:18 -- vfio/run.sh@24 -- # local core=0x1 00:07:56.231 10:26:18 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:07:56.231 10:26:18 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:07:56.231 10:26:18 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:07:56.231 10:26:18 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:07:56.231 10:26:18 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:07:56.231 10:26:18 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:07:56.231 10:26:18 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:07:56.231 10:26:18 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:07:56.231 10:26:18 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:07:56.231 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:56.231 10:26:18 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:56.231 10:26:18 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:07:56.231 10:26:18 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:07:56.231 [2024-04-19 10:26:18.144579] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization... 00:07:56.231 [2024-04-19 10:26:18.144650] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid210880 ] 00:07:56.231 EAL: No free 2048 kB hugepages reported on node 1 00:07:56.231 [2024-04-19 10:26:18.215904] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.231 [2024-04-19 10:26:18.291732] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.490 INFO: Running with entropic power schedule (0xFF, 100). 00:07:56.490 INFO: Seed: 3931149538 00:07:56.490 INFO: Loaded 1 modules (345392 inline 8-bit counters): 345392 [0x286cf8c, 0x28c14bc), 00:07:56.490 INFO: Loaded 1 PC tables (345392 PCs): 345392 [0x28c14c0,0x2e067c0), 00:07:56.490 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:07:56.490 INFO: A corpus is not provided, starting from an empty corpus 00:07:56.490 #2 INITED exec/s: 0 rss: 63Mb 00:07:56.490 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
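Every run so far has logged "EAL: No free 2048 kB hugepages reported on node 1" during DPDK initialization: node 1 simply has no free 2 MiB hugepages, and since the harness sets HUGEMEM=4096 earlier in the trace and the runs proceed, hugepage memory is evidently being satisfied from node 0. One way to see what EAL saw, using the standard kernel sysfs layout (a diagnostic sketch, not part of the harness):

    # Per-NUMA-node 2 MiB hugepage counts.
    for d in /sys/devices/system/node/node*/hugepages/hugepages-2048kB; do
        node=${d%/hugepages/*}; node=${node##*/}
        echo "$node: total=$(cat "$d/nr_hugepages") free=$(cat "$d/free_hugepages")"
    done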
00:07:56.490 This may also happen if the target rejected all inputs we tried so far
00:07:56.490 [2024-04-19 10:26:18.539434] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: enabling controller
00:07:56.749 [2024-04-19 10:26:18.619726] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:07:56.749 [2024-04-19 10:26:18.619763] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:07:57.007 NEW_FUNC[1/636]: 0x484000 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:171
00:07:57.007 NEW_FUNC[2/636]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:07:57.007 #13 NEW cov: 10786 ft: 10751 corp: 2/14b lim: 13 exec/s: 0 rss: 69Mb L: 13/13 MS: 1 InsertRepeatedBytes-
00:07:57.265 [2024-04-19 10:26:19.134563] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:07:57.265 [2024-04-19 10:26:19.134609] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:07:57.265 #19 NEW cov: 10802 ft: 13906 corp: 3/27b lim: 13 exec/s: 0 rss: 70Mb L: 13/13 MS: 1 ChangeBit-
00:07:57.265 [2024-04-19 10:26:19.327598] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:07:57.265 [2024-04-19 10:26:19.327630] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:07:57.523 NEW_FUNC[1/1]: 0x198a540 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:07:57.523 #22 NEW cov: 10819 ft: 15755 corp: 4/40b lim: 13 exec/s: 0 rss: 71Mb L: 13/13 MS: 3 EraseBytes-CrossOver-CrossOver-
00:07:57.523 [2024-04-19 10:26:19.544423] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:07:57.523 [2024-04-19 10:26:19.544454] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:07:57.782 #28 NEW cov: 10819 ft: 16257 corp: 5/53b lim: 13 exec/s: 28 rss: 72Mb L: 13/13 MS: 1 ChangeBit-
00:07:57.782 [2024-04-19 10:26:19.743583] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:07:57.782 [2024-04-19 10:26:19.743614] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:07:57.782 #47 NEW cov: 10819 ft: 16362 corp: 6/66b lim: 13 exec/s: 47 rss: 72Mb L: 13/13 MS: 4 EraseBytes-CopyPart-CopyPart-CrossOver-
00:07:58.039 [2024-04-19 10:26:19.947233] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:07:58.040 [2024-04-19 10:26:19.947266] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:07:58.040 #48 NEW cov: 10819 ft: 16740 corp: 7/79b lim: 13 exec/s: 48 rss: 72Mb L: 13/13 MS: 1 ChangeByte-
00:07:58.297 [2024-04-19 10:26:20.159482] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:07:58.297 [2024-04-19 10:26:20.159527] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:07:58.297 #49 NEW cov: 10819 ft: 17051 corp: 8/92b lim: 13 exec/s: 49 rss: 72Mb L: 13/13 MS: 1 ChangeBit-
00:07:58.297 [2024-04-19 10:26:20.363504] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:07:58.297 [2024-04-19 10:26:20.363536] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:07:58.556 #50 NEW cov: 10826 ft: 17375 corp: 9/105b lim: 13 exec/s: 50 rss: 72Mb L: 13/13 MS: 1 ChangeBit-
00:07:58.556 [2024-04-19 10:26:20.561744] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:07:58.556 [2024-04-19 10:26:20.561774] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:07:58.814 #51 NEW cov: 10826 ft: 17636 corp: 10/118b lim: 13 exec/s: 25 rss: 72Mb L: 13/13 MS: 1 ChangeBit-
00:07:58.814 #51 DONE cov: 10826 ft: 17636 corp: 10/118b lim: 13 exec/s: 25 rss: 72Mb
00:07:58.814 Done 51 runs in 2 second(s)
00:07:58.814 [2024-04-19 10:26:20.697992] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: disabling controller
00:07:59.073 10:26:20 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-5 /var/tmp/suppress_vfio_fuzz
00:07:59.073 10:26:20 -- ../common.sh@72 -- # (( i++ ))
00:07:59.073 10:26:20 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:59.073 10:26:20 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1
00:07:59.073 10:26:20 -- vfio/run.sh@22 -- # local fuzzer_type=6
00:07:59.073 10:26:20 -- vfio/run.sh@23 -- # local timen=1
00:07:59.073 10:26:20 -- vfio/run.sh@24 -- # local core=0x1
00:07:59.073 10:26:20 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
00:07:59.073 10:26:20 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6
00:07:59.073 10:26:20 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1
00:07:59.073 10:26:20 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2
00:07:59.073 10:26:20 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf
00:07:59.073 10:26:20 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz
00:07:59.073 10:26:20 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0
00:07:59.073 10:26:20 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
00:07:59.073 10:26:20 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%;
00:07:59.073 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:07:59.073 10:26:20 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect
00:07:59.073 10:26:20 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create
00:07:59.073 10:26:20 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6
00:07:59.073 [2024-04-19 10:26:20.987041] Starting SPDK v24.05-pre git sha1 3381d6e5b / DPDK 23.11.0 initialization...
00:07:59.073 [2024-04-19 10:26:20.987111] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid211235 ]
00:07:59.073 EAL: No free 2048 kB hugepages reported on node 1
00:07:59.073 [2024-04-19 10:26:21.059454] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:59.073 [2024-04-19 10:26:21.135105] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:59.332 INFO: Running with entropic power schedule (0xFF, 100).
00:07:59.332 INFO: Seed: 2477187440
00:07:59.332 INFO: Loaded 1 modules (345392 inline 8-bit counters): 345392 [0x286cf8c, 0x28c14bc),
00:07:59.332 INFO: Loaded 1 PC tables (345392 PCs): 345392 [0x28c14c0,0x2e067c0),
00:07:59.332 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
00:07:59.332 INFO: A corpus is not provided, starting from an empty corpus
00:07:59.332 #2 INITED exec/s: 0 rss: 63Mb
00:07:59.332 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:59.332 This may also happen if the target rejected all inputs we tried so far
00:07:59.332 [2024-04-19 10:26:21.372383] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: enabling controller
00:07:59.332 [2024-04-19 10:26:21.419846] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:07:59.332 [2024-04-19 10:26:21.419907] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:07:59.849 NEW_FUNC[1/623]: 0x484cf0 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190
00:07:59.849 NEW_FUNC[2/623]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:07:59.849 #9 NEW cov: 10636 ft: 10745 corp: 2/10b lim: 9 exec/s: 0 rss: 69Mb L: 9/9 MS: 2 ShuffleBytes-InsertRepeatedBytes-
00:07:59.849 [2024-04-19 10:26:21.909875] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:07:59.849 [2024-04-19 10:26:21.909921] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:00.107 NEW_FUNC[1/13]: 0x13929a0 in handle_cmd_rsp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:2510
00:08:00.107 NEW_FUNC[2/13]: 0x1619990 in _nvme_ns_cmd_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ns_cmd.c:428
00:08:00.107 #10 NEW cov: 10794 ft: 14102 corp: 3/19b lim: 9 exec/s: 0 rss: 70Mb L: 9/9 MS: 1 ChangeByte-
00:08:00.107 [2024-04-19 10:26:22.100067] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:00.107 [2024-04-19 10:26:22.100100] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:00.107 NEW_FUNC[1/1]: 0x198a540 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:00.107 #21 NEW cov: 10811 ft: 14338 corp: 4/28b lim: 9 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 CrossOver-
00:08:00.365 [2024-04-19 10:26:22.279293] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:00.365 [2024-04-19 10:26:22.279323] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:00.365 #22 NEW cov: 10811 ft: 14634 corp: 5/37b lim: 9 exec/s: 22 rss: 72Mb L: 9/9 MS: 1 ChangeBit-
00:08:00.365 [2024-04-19 10:26:22.459983] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:00.365 [2024-04-19 10:26:22.460014] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:00.623 #23 NEW cov: 10811 ft: 14696 corp: 6/46b lim: 9 exec/s: 23 rss: 72Mb L: 9/9 MS: 1 ChangeBit-
00:08:00.623 [2024-04-19 10:26:22.640147] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:00.623 [2024-04-19 10:26:22.640177] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:00.881 #24 NEW cov: 10811 ft: 15165 corp: 7/55b lim: 9 exec/s: 24 rss: 72Mb L: 9/9 MS: 1 CopyPart-
00:08:00.881 [2024-04-19 10:26:22.820442] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:00.881 [2024-04-19 10:26:22.820473] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:00.881 #30 NEW cov: 10811 ft: 15852 corp: 8/64b lim: 9 exec/s: 30 rss: 72Mb L: 9/9 MS: 1 ChangeByte-
00:08:01.140 [2024-04-19 10:26:23.000950] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:01.140 [2024-04-19 10:26:23.000981] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:01.140 #36 NEW cov: 10811 ft: 16296 corp: 9/73b lim: 9 exec/s: 36 rss: 72Mb L: 9/9 MS: 1 ChangeBit-
00:08:01.140 [2024-04-19 10:26:23.190971] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:01.140 [2024-04-19 10:26:23.191067] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:01.397 #37 NEW cov: 10818 ft: 16345 corp: 10/82b lim: 9 exec/s: 37 rss: 72Mb L: 9/9 MS: 1 ChangeBit-
00:08:01.397 [2024-04-19 10:26:23.372327] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:01.397 [2024-04-19 10:26:23.372360] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:01.397 #38 NEW cov: 10818 ft: 16682 corp: 11/91b lim: 9 exec/s: 19 rss: 72Mb L: 9/9 MS: 1 ChangeBinInt-
00:08:01.397 #38 DONE cov: 10818 ft: 16682 corp: 11/91b lim: 9 exec/s: 19 rss: 72Mb
00:08:01.397 Done 38 runs in 2 second(s)
00:08:01.397 [2024-04-19 10:26:23.501992] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: disabling controller
00:08:01.655 10:26:23 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-6 /var/tmp/suppress_vfio_fuzz
00:08:01.655 10:26:23 -- ../common.sh@72 -- # (( i++ ))
00:08:01.655 10:26:23 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:01.655 10:26:23 -- vfio/run.sh@84 -- # trap - SIGINT SIGTERM EXIT
00:08:01.655
00:08:01.655 real 0m19.609s
00:08:01.655 user 0m27.067s
00:08:01.655 sys 0m1.794s
00:08:01.655 10:26:23 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:08:01.655 10:26:23 -- common/autotest_common.sh@10 -- # set +x
00:08:01.655 ************************************
00:08:01.655 END TEST vfio_fuzz
00:08:01.655 ************************************
00:08:01.913 10:26:23 -- fuzz/llvm.sh@67 -- # [[ 1 -eq 0 ]]
00:08:01.913
00:08:01.913 real 1m25.468s
00:08:01.913 user 2m7.829s
00:08:01.913 sys 0m10.307s
00:08:01.913 10:26:23 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:08:01.913 10:26:23 -- common/autotest_common.sh@10 -- # set +x
00:08:01.913 ************************************
00:08:01.913 END TEST llvm_fuzz
00:08:01.913 ************************************
00:08:01.913 10:26:23 -- spdk/autotest.sh@373 -- # [[ 0 -eq 1 ]]
00:08:01.913 10:26:23 -- spdk/autotest.sh@378 -- # trap - SIGINT SIGTERM EXIT
00:08:01.913 10:26:23 -- spdk/autotest.sh@380 -- # timing_enter post_cleanup
00:08:01.913 10:26:23 -- common/autotest_common.sh@710 -- # xtrace_disable
00:08:01.913 10:26:23 -- common/autotest_common.sh@10 -- # set +x
00:08:01.913 10:26:23 -- spdk/autotest.sh@381 -- # autotest_cleanup
00:08:01.913 10:26:23 -- common/autotest_common.sh@1378 -- # local autotest_es=0
00:08:01.913 10:26:23 -- common/autotest_common.sh@1379 -- # xtrace_disable
00:08:01.913 10:26:23 -- common/autotest_common.sh@10 -- # set +x
00:08:06.099 INFO: APP EXITING
00:08:06.099 INFO: killing all VMs
00:08:06.099 INFO: killing vhost app
00:08:06.099 INFO: EXIT DONE
00:08:09.388 Waiting for block devices as requested
00:08:09.388 0000:5e:00.0 (144d a80a): vfio-pci -> nvme
00:08:09.388 0000:af:00.0 (8086 2701): vfio-pci -> nvme
00:08:09.388 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma
00:08:09.388 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma
00:08:09.648 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma
00:08:09.648 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma
00:08:09.648 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma
00:08:09.648 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma
00:08:09.907 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma
00:08:09.907 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma
00:08:09.907 0000:b0:00.0 (8086 4140): vfio-pci -> nvme
00:08:10.166 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma
00:08:10.166 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma
00:08:10.166 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma
00:08:10.425 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma
00:08:10.425 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma
00:08:10.425 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma
00:08:10.684 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma
00:08:10.684 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma
00:08:13.975 Cleaning
00:08:13.975 Removing: /dev/shm/spdk_tgt_trace.pid182578
00:08:13.975 Removing: /var/run/dpdk/spdk_pid182075
00:08:13.975 Removing: /var/run/dpdk/spdk_pid182578
00:08:13.975 Removing: /var/run/dpdk/spdk_pid183168
00:08:13.975 Removing: /var/run/dpdk/spdk_pid183968
00:08:13.975 Removing: /var/run/dpdk/spdk_pid184184
00:08:13.975 Removing: /var/run/dpdk/spdk_pid184944
00:08:13.975 Removing: /var/run/dpdk/spdk_pid185052
00:08:13.975 Removing: /var/run/dpdk/spdk_pid185455
00:08:13.975 Removing: /var/run/dpdk/spdk_pid185680
00:08:13.975 Removing: /var/run/dpdk/spdk_pid185921
00:08:13.975 Removing: /var/run/dpdk/spdk_pid186178
00:08:13.975 Removing: /var/run/dpdk/spdk_pid186575
00:08:13.975 Removing: /var/run/dpdk/spdk_pid186778
00:08:13.975 Removing: /var/run/dpdk/spdk_pid186971
00:08:13.975 Removing: /var/run/dpdk/spdk_pid187193
00:08:13.975 Removing: /var/run/dpdk/spdk_pid187949
00:08:13.975 Removing: /var/run/dpdk/spdk_pid190255
00:08:13.975 Removing: /var/run/dpdk/spdk_pid190629
00:08:13.975 Removing: /var/run/dpdk/spdk_pid190933
00:08:13.975 Removing: /var/run/dpdk/spdk_pid191138
00:08:13.975 Removing: /var/run/dpdk/spdk_pid191712
00:08:13.975 Removing: /var/run/dpdk/spdk_pid192011
00:08:13.975 Removing: /var/run/dpdk/spdk_pid192576
00:08:13.975 Removing: /var/run/dpdk/spdk_pid192743
00:08:13.975 Removing: /var/run/dpdk/spdk_pid192955
00:08:13.975 Removing: /var/run/dpdk/spdk_pid193116
00:08:13.975 Removing: /var/run/dpdk/spdk_pid193259
00:08:13.976 Removing: /var/run/dpdk/spdk_pid193355
00:08:13.976 Removing: /var/run/dpdk/spdk_pid193801
00:08:13.976 Removing: /var/run/dpdk/spdk_pid193994
00:08:13.976 Removing: /var/run/dpdk/spdk_pid194195
00:08:13.976 Removing: /var/run/dpdk/spdk_pid194429
00:08:13.976 Removing: /var/run/dpdk/spdk_pid194648
00:08:13.976 Removing: /var/run/dpdk/spdk_pid194685
00:08:13.976 Removing: /var/run/dpdk/spdk_pid194923
00:08:13.976 Removing: /var/run/dpdk/spdk_pid195117
00:08:13.976 Removing: /var/run/dpdk/spdk_pid195316
00:08:13.976 Removing: /var/run/dpdk/spdk_pid195512
00:08:13.976 Removing: /var/run/dpdk/spdk_pid195815
00:08:13.976 Removing: /var/run/dpdk/spdk_pid196067
00:08:13.976 Removing: /var/run/dpdk/spdk_pid196260
00:08:13.976 Removing: /var/run/dpdk/spdk_pid196457
00:08:13.976 Removing: /var/run/dpdk/spdk_pid196656
00:08:13.976 Removing: /var/run/dpdk/spdk_pid196856
00:08:13.976 Removing: /var/run/dpdk/spdk_pid197049
00:08:13.976 Removing: /var/run/dpdk/spdk_pid197280
00:08:13.976 Removing: /var/run/dpdk/spdk_pid197551
00:08:13.976 Removing: /var/run/dpdk/spdk_pid197805
00:08:13.976 Removing: /var/run/dpdk/spdk_pid197997
00:08:13.976 Removing: /var/run/dpdk/spdk_pid198199
00:08:13.976 Removing: /var/run/dpdk/spdk_pid198394
00:08:13.976 Removing: /var/run/dpdk/spdk_pid198596
00:08:13.976 Removing: /var/run/dpdk/spdk_pid198795
00:08:13.976 Removing: /var/run/dpdk/spdk_pid199035
00:08:13.976 Removing: /var/run/dpdk/spdk_pid199330
00:08:13.976 Removing: /var/run/dpdk/spdk_pid199422
00:08:13.976 Removing: /var/run/dpdk/spdk_pid199750
00:08:13.976 Removing: /var/run/dpdk/spdk_pid200246
00:08:13.976 Removing: /var/run/dpdk/spdk_pid200596
00:08:13.976 Removing: /var/run/dpdk/spdk_pid200941
00:08:13.976 Removing: /var/run/dpdk/spdk_pid201294
00:08:13.976 Removing: /var/run/dpdk/spdk_pid201643
00:08:13.976 Removing: /var/run/dpdk/spdk_pid202000
00:08:13.976 Removing: /var/run/dpdk/spdk_pid202348
00:08:13.976 Removing: /var/run/dpdk/spdk_pid202701
00:08:13.976 Removing: /var/run/dpdk/spdk_pid203051
00:08:13.976 Removing: /var/run/dpdk/spdk_pid203420
00:08:13.976 Removing: /var/run/dpdk/spdk_pid203767
00:08:13.976 Removing: /var/run/dpdk/spdk_pid204120
00:08:13.976 Removing: /var/run/dpdk/spdk_pid204477
00:08:13.976 Removing: /var/run/dpdk/spdk_pid204814
00:08:13.976 Removing: /var/run/dpdk/spdk_pid205135
00:08:13.976 Removing: /var/run/dpdk/spdk_pid205455
00:08:13.976 Removing: /var/run/dpdk/spdk_pid205782
00:08:13.976 Removing: /var/run/dpdk/spdk_pid206122
00:08:13.976 Removing: /var/run/dpdk/spdk_pid206483
00:08:13.976 Removing: /var/run/dpdk/spdk_pid206842
00:08:13.976 Removing: /var/run/dpdk/spdk_pid207204
00:08:13.976 Removing: /var/run/dpdk/spdk_pid207552
00:08:13.976 Removing: /var/run/dpdk/spdk_pid207880
00:08:13.976 Removing: /var/run/dpdk/spdk_pid208179
00:08:13.976 Removing: /var/run/dpdk/spdk_pid208530
00:08:13.976 Removing: /var/run/dpdk/spdk_pid209079
00:08:13.976 Removing: /var/run/dpdk/spdk_pid209454
00:08:13.976 Removing: /var/run/dpdk/spdk_pid209820
00:08:13.976 Removing: /var/run/dpdk/spdk_pid210174
00:08:13.976 Removing: /var/run/dpdk/spdk_pid210522
00:08:13.976 Removing: /var/run/dpdk/spdk_pid210880
00:08:13.976 Removing: /var/run/dpdk/spdk_pid211235
00:08:13.976 Clean
00:08:13.976 10:26:35 -- common/autotest_common.sh@1437 -- # return 0
00:08:13.976 10:26:35 -- spdk/autotest.sh@382 -- # timing_exit post_cleanup
00:08:13.976 10:26:35 -- common/autotest_common.sh@716 -- # xtrace_disable
00:08:13.976 10:26:35 -- common/autotest_common.sh@10 -- # set +x
00:08:13.976 10:26:35 -- spdk/autotest.sh@384 -- # timing_exit autotest
00:08:13.976 10:26:35 -- common/autotest_common.sh@716 -- # xtrace_disable
00:08:13.976 10:26:35 -- common/autotest_common.sh@10 -- # set +x
00:08:13.976 10:26:35 -- spdk/autotest.sh@385 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:08:13.976 10:26:35 -- spdk/autotest.sh@387 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]]
00:08:13.976 10:26:35 -- spdk/autotest.sh@387 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log
00:08:13.976 10:26:35 -- spdk/autotest.sh@389 -- # hash lcov
00:08:13.976 10:26:35 -- spdk/autotest.sh@389 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]]
00:08:13.976 10:26:35 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:08:13.976 10:26:35 -- scripts/common.sh@502 -- $ [[ -e /bin/wpdk_common.sh ]]
00:08:13.976 10:26:35 -- scripts/common.sh@510 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:08:13.976 10:26:35 -- scripts/common.sh@511 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:08:13.976 10:26:35 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:08:13.976 10:26:35 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:08:13.976 10:26:35 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:08:13.976 10:26:35 -- paths/export.sh@5 -- $ export PATH
00:08:13.976 10:26:35 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:08:13.976 10:26:35 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:08:13.976 10:26:35 -- common/autobuild_common.sh@435 -- $ date +%s
00:08:13.976 10:26:35 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1713515195.XXXXXX
00:08:13.976 10:26:35 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1713515195.CSq69F
00:08:13.976 10:26:35 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]]
00:08:13.976 10:26:35 -- common/autobuild_common.sh@441 -- $ '[' -n '' ']'
00:08:13.976 10:26:35 -- common/autobuild_common.sh@444 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/'
00:08:13.976 10:26:35 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:08:13.976 10:26:35 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:08:13.976 10:26:35 -- common/autobuild_common.sh@451 -- $ get_config_params
00:08:13.976 10:26:35 -- common/autotest_common.sh@385 -- $ xtrace_disable
00:08:13.976 10:26:35 -- common/autotest_common.sh@10 -- $ set +x
00:08:13.976 10:26:35 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user'
00:08:13.976 10:26:35 -- common/autobuild_common.sh@453 -- $ start_monitor_resources
00:08:13.976 10:26:35 -- pm/common@17 -- $ local monitor
00:08:13.976 10:26:35 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:08:13.976 10:26:35 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=216587
00:08:13.976 10:26:35 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:08:13.976 10:26:35 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=216588
00:08:13.976 10:26:35 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:08:13.976 10:26:35 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=216590
00:08:13.976 10:26:35 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:08:13.976 10:26:35 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=216593
00:08:13.976 10:26:35 -- pm/common@21 -- $ date +%s
00:08:13.976 10:26:35 -- pm/common@26 -- $ sleep 1
00:08:13.976 10:26:35 -- pm/common@21 -- $ date +%s
00:08:13.976 10:26:35 -- pm/common@21 -- $ date +%s
00:08:13.976 10:26:35 -- pm/common@21 -- $ date +%s
00:08:13.976 10:26:35 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1713515195
00:08:13.976 10:26:35 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1713515195
00:08:13.976 10:26:35 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1713515195
00:08:13.976 10:26:35 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1713515195
00:08:13.976 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1713515195_collect-vmstat.pm.log
00:08:13.976 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1713515195_collect-cpu-temp.pm.log
00:08:13.977 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1713515195_collect-cpu-load.pm.log
00:08:13.977 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1713515195_collect-bmc-pm.bmc.pm.log
00:08:14.913 10:26:36 -- common/autobuild_common.sh@454 -- $ trap stop_monitor_resources EXIT
00:08:14.913 10:26:36 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72
00:08:14.913 10:26:36 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:08:14.913 10:26:36 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:08:14.913 10:26:36 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:08:14.913 10:26:36 -- spdk/autopackage.sh@19 -- $ timing_finish
00:08:14.913 10:26:36 -- common/autotest_common.sh@722 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:08:14.913 10:26:36 -- common/autotest_common.sh@723 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:08:14.913 10:26:37 -- common/autotest_common.sh@725 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:08:15.172 10:26:37 -- spdk/autopackage.sh@20 -- $ exit 0
00:08:15.172 10:26:37 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources
00:08:15.172 10:26:37 -- pm/common@30 -- $ signal_monitor_resources TERM
00:08:15.172 10:26:37 -- pm/common@41 -- $ local monitor pid pids signal=TERM
00:08:15.172 10:26:37 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:08:15.172 10:26:37 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]]
00:08:15.172 10:26:37 -- pm/common@45 -- $ pid=216607
00:08:15.172 10:26:37 -- pm/common@52 -- $ sudo kill -TERM 216607
00:08:15.172 10:26:37 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:08:15.172 10:26:37 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]]
00:08:15.172 10:26:37 -- pm/common@45 -- $ pid=216601
00:08:15.172 10:26:37 -- pm/common@52 -- $ sudo kill -TERM 216601
00:08:15.172 10:26:37 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:08:15.172 10:26:37 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]]
00:08:15.172 10:26:37 -- pm/common@45 -- $ pid=216602
00:08:15.172 10:26:37 -- pm/common@52 -- $ sudo kill -TERM 216602
00:08:15.172 10:26:37 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:08:15.172 10:26:37 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]]
00:08:15.172 10:26:37 -- pm/common@45 -- $ pid=216608
00:08:15.172 10:26:37 -- pm/common@52 -- $ sudo kill -TERM 216608
00:08:15.172 + [[ -n 83366 ]]
00:08:15.172 + sudo kill 83366
00:08:15.185 [Pipeline] }
00:08:15.204 [Pipeline] // stage
00:08:15.210 [Pipeline] }
00:08:15.231 [Pipeline] // timeout
00:08:15.238 [Pipeline] }
00:08:15.256 [Pipeline] // catchError
00:08:15.263 [Pipeline] }
00:08:15.281 [Pipeline] // wrap
00:08:15.298 [Pipeline] }
00:08:15.315 [Pipeline] // catchError
00:08:15.325 [Pipeline] stage
00:08:15.327 [Pipeline] { (Epilogue)
00:08:15.342 [Pipeline] catchError
00:08:15.344 [Pipeline] {
00:08:15.359 [Pipeline] echo
00:08:15.360 Cleanup processes
00:08:15.367 [Pipeline] sh
00:08:15.659 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:08:15.659 216700 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/sdr.cache
00:08:15.659 216927 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:08:15.674 [Pipeline] sh
00:08:15.963 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:08:15.963 ++ grep -v 'sudo pgrep'
00:08:15.963 ++ awk '{print $1}'
00:08:15.963 + sudo kill -9 216700
00:08:15.976 [Pipeline] sh
00:08:16.267 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:08:17.220 [Pipeline] sh
00:08:17.508 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:08:17.508 Artifacts sizes are good
00:08:17.523 [Pipeline] archiveArtifacts
00:08:17.531 Archiving artifacts
00:08:17.991 [Pipeline] sh
00:08:18.277 + sudo chown -R sys_sgci /var/jenkins/workspace/short-fuzz-phy-autotest
00:08:18.292 [Pipeline] cleanWs
00:08:18.301 [WS-CLEANUP] Deleting project workspace...
00:08:18.301 [WS-CLEANUP] Deferred wipeout is used...
00:08:18.308 [WS-CLEANUP] done
00:08:18.310 [Pipeline] }
00:08:18.329 [Pipeline] // catchError
00:08:18.339 [Pipeline] sh
00:08:18.638 + logger -p user.info -t JENKINS-CI
00:08:18.649 [Pipeline] }
00:08:18.664 [Pipeline] // stage
00:08:18.669 [Pipeline] }
00:08:18.686 [Pipeline] // node
00:08:18.691 [Pipeline] End of Pipeline
00:08:18.734 Finished: SUCCESS