00:00:00.001 Started by upstream project "autotest-spdk-master-vs-dpdk-v23.11" build number 987
00:00:00.001 originally caused by:
00:00:00.001 Started by upstream project "nightly-trigger" build number 3649
00:00:00.001 originally caused by:
00:00:00.001 Started by timer
00:00:00.055 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.056 The recommended git tool is: git
00:00:00.056 using credential 00000000-0000-0000-0000-000000000002
00:00:00.058 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.074 Fetching changes from the remote Git repository
00:00:00.075 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.094 Using shallow fetch with depth 1
00:00:00.094 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.094 > git --version # timeout=10
00:00:00.111 > git --version # 'git version 2.39.2'
00:00:00.111 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.131 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.131 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:03.785 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:03.798 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:03.810 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD)
00:00:03.810 > git config core.sparsecheckout # timeout=10
00:00:03.821 > git read-tree -mu HEAD # timeout=10
00:00:03.836 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5
00:00:03.855 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag"
00:00:03.855 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10
00:00:03.966 [Pipeline] Start of Pipeline
00:00:03.980 [Pipeline] library
00:00:03.982 Loading library shm_lib@master
00:00:03.982 Library shm_lib@master is cached. Copying from home.
00:00:04.002 [Pipeline] node
00:00:04.031 Running on WFP39 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:04.033 [Pipeline] {
00:00:04.043 [Pipeline] catchError
00:00:04.045 [Pipeline] {
00:00:04.058 [Pipeline] wrap
00:00:04.067 [Pipeline] {
00:00:04.076 [Pipeline] stage
00:00:04.078 [Pipeline] { (Prologue)
00:00:04.286 [Pipeline] sh
00:00:04.570 + logger -p user.info -t JENKINS-CI
00:00:04.589 [Pipeline] echo
00:00:04.591 Node: WFP39
00:00:04.600 [Pipeline] sh
00:00:04.899 [Pipeline] setCustomBuildProperty
00:00:04.909 [Pipeline] echo
00:00:04.910 Cleanup processes
00:00:04.915 [Pipeline] sh
00:00:05.196 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:05.196 1332802 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:05.210 [Pipeline] sh
00:00:05.495 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:05.495 ++ grep -v 'sudo pgrep'
00:00:05.495 ++ awk '{print $1}'
00:00:05.495 + sudo kill -9
00:00:05.495 + true
00:00:05.508 [Pipeline] cleanWs
00:00:05.516 [WS-CLEANUP] Deleting project workspace...
00:00:05.516 [WS-CLEANUP] Deferred wipeout is used...
00:00:05.521 [WS-CLEANUP] done
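The cleanup stage above is a three-step shell pipeline: list candidate processes, drop the pgrep invocation itself, keep only the PIDs, then kill hard. A minimal standalone sketch of the same idiom (the workspace path is taken from the log; everything else is illustrative):

  #!/usr/bin/env bash
  # Kill any stale processes still referencing the job workspace.
  ws=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  pids=$(sudo pgrep -af "$ws" | grep -v 'sudo pgrep' | awk '{print $1}')
  # Mirror the `+ true` in the log: having no leftovers must not fail the build.
  [ -n "$pids" ] && sudo kill -9 $pids || true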
00:00:05.524 [Pipeline] setCustomBuildProperty
00:00:05.533 [Pipeline] sh
00:00:05.811 + sudo git config --global --replace-all safe.directory '*'
00:00:05.895 [Pipeline] httpRequest
00:00:06.239 [Pipeline] echo
00:00:06.240 Sorcerer 10.211.164.20 is alive
00:00:06.246 [Pipeline] retry
00:00:06.247 [Pipeline] {
00:00:06.256 [Pipeline] httpRequest
00:00:06.259 HttpMethod: GET
00:00:06.260 URL: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:06.260 Sending request to url: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:06.262 Response Code: HTTP/1.1 200 OK
00:00:06.263 Success: Status code 200 is in the accepted range: 200,404
00:00:06.263 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:06.735 [Pipeline] }
00:00:06.750 [Pipeline] // retry
00:00:06.755 [Pipeline] sh
00:00:07.034 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:07.050 [Pipeline] httpRequest
00:00:07.719 [Pipeline] echo
00:00:07.721 Sorcerer 10.211.164.20 is alive
00:00:07.729 [Pipeline] retry
00:00:07.731 [Pipeline] {
00:00:07.743 [Pipeline] httpRequest
00:00:07.750 HttpMethod: GET
00:00:07.751 URL: http://10.211.164.20/packages/spdk_557f022f641abf567fb02704f67857eb8f6d9ff3.tar.gz
00:00:07.751 Sending request to url: http://10.211.164.20/packages/spdk_557f022f641abf567fb02704f67857eb8f6d9ff3.tar.gz
00:00:07.751 Response Code: HTTP/1.1 200 OK
00:00:07.752 Success: Status code 200 is in the accepted range: 200,404
00:00:07.752 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_557f022f641abf567fb02704f67857eb8f6d9ff3.tar.gz
00:00:30.478 [Pipeline] }
00:00:30.495 [Pipeline] // retry
00:00:30.502 [Pipeline] sh
00:00:30.785 + tar --no-same-owner -xf spdk_557f022f641abf567fb02704f67857eb8f6d9ff3.tar.gz
00:00:33.613 [Pipeline] sh
00:00:33.944 + git -C spdk log --oneline -n5
00:00:33.944 557f022f6 bdev: Change 1st parameter of bdev_bytes_to_blocks from bdev to desc
00:00:33.944 c0b2ac5c9 bdev: Change void to bdev_io pointer of parameter of _bdev_io_submit()
00:00:33.944 92fb22519 dif: dif_generate/verify_copy() supports NVMe PRACT = 1 and MD size > PI size
00:00:33.944 79daf868a dif: Add SPDK_DIF_FLAGS_NVME_PRACT for dif_generate/verify_copy()
00:00:33.944 431baf1b5 dif: Insert abstraction into dif_generate/verify_copy() for NVMe PRACT
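Each dependency above is fetched from the Sorcerer package cache as a commit-pinned tarball, with 404 in the accepted range so a cache miss can fall back to a plain git fetch instead of failing the request step. A rough shell equivalent of that fetch-then-extract pattern (the pipeline itself uses the Jenkins httpRequest step, not curl):

  # Fetch a pinned tarball; treat 404 as "not cached", anything else as an error.
  pkg=spdk_557f022f641abf567fb02704f67857eb8f6d9ff3.tar.gz
  code=$(curl -s -o "$pkg" -w '%{http_code}' "http://10.211.164.20/packages/$pkg")
  if [ "$code" = 200 ]; then
      tar --no-same-owner -xf "$pkg"   # extract as the CI user, not the archive's uid
  elif [ "$code" = 404 ]; then
      echo "cache miss; falling back to git" >&2
  else
      exit 1
  fi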
00:00:33.970 [Pipeline] withCredentials
00:00:33.995 > git --version # timeout=10
00:00:34.009 > git --version # 'git version 2.39.2'
00:00:34.030 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS
00:00:34.032 [Pipeline] {
00:00:34.041 [Pipeline] retry
00:00:34.043 [Pipeline] {
00:00:34.057 [Pipeline] sh
00:00:34.348 + git ls-remote http://dpdk.org/git/dpdk-stable v23.11
00:00:35.880 [Pipeline] }
00:00:35.898 [Pipeline] // retry
00:00:35.904 [Pipeline] }
00:00:35.923 [Pipeline] // withCredentials
00:00:35.932 [Pipeline] httpRequest
00:00:36.282 [Pipeline] echo
00:00:36.284 Sorcerer 10.211.164.20 is alive
00:00:36.295 [Pipeline] retry
00:00:36.298 [Pipeline] {
00:00:36.314 [Pipeline] httpRequest
00:00:36.319 HttpMethod: GET
00:00:36.320 URL: http://10.211.164.20/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:00:36.321 Sending request to url: http://10.211.164.20/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:00:36.329 Response Code: HTTP/1.1 200 OK
00:00:36.330 Success: Status code 200 is in the accepted range: 200,404
00:00:36.330 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:01:34.976 [Pipeline] }
00:01:34.993 [Pipeline] // retry
00:01:35.002 [Pipeline] sh
00:01:35.375 + tar --no-same-owner -xf dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:01:36.787 [Pipeline] sh
00:01:37.073 + git -C dpdk log --oneline -n5
00:01:37.073 eeb0605f11 version: 23.11.0
00:01:37.073 238778122a doc: update release notes for 23.11
00:01:37.073 46aa6b3cfc doc: fix description of RSS features
00:01:37.073 dd88f51a57 devtools: forbid DPDK API in cnxk base driver
00:01:37.073 7e421ae345 devtools: support skipping forbid rule check
00:01:37.083 [Pipeline] }
00:01:37.098 [Pipeline] // stage
00:01:37.109 [Pipeline] stage
00:01:37.112 [Pipeline] { (Prepare)
00:01:37.136 [Pipeline] writeFile
00:01:37.154 [Pipeline] sh
00:01:37.440 + logger -p user.info -t JENKINS-CI
00:01:37.453 [Pipeline] sh
00:01:37.739 + logger -p user.info -t JENKINS-CI
00:01:37.751 [Pipeline] sh
00:01:38.037 + cat autorun-spdk.conf
00:01:38.037 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:38.037 SPDK_TEST_FUZZER_SHORT=1
00:01:38.037 SPDK_TEST_FUZZER=1
00:01:38.037 SPDK_TEST_SETUP=1
00:01:38.037 SPDK_RUN_UBSAN=1
00:01:38.037 SPDK_TEST_NATIVE_DPDK=v23.11
00:01:38.037 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:38.045 RUN_NIGHTLY=1
00:01:38.049 [Pipeline] readFile
00:01:38.072 [Pipeline] withEnv
00:01:38.074 [Pipeline] {
00:01:38.085 [Pipeline] sh
00:01:38.371 + set -ex
00:01:38.371 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:01:38.371 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:38.371 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:38.371 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:38.371 ++ SPDK_TEST_FUZZER=1
00:01:38.371 ++ SPDK_TEST_SETUP=1
00:01:38.371 ++ SPDK_RUN_UBSAN=1
00:01:38.371 ++ SPDK_TEST_NATIVE_DPDK=v23.11
00:01:38.371 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:38.371 ++ RUN_NIGHTLY=1
00:01:38.371 + case $SPDK_TEST_NVMF_NICS in
00:01:38.371 + DRIVERS=
00:01:38.371 + [[ -n '' ]]
00:01:38.371 + exit 0
00:01:38.381 [Pipeline] }
00:01:38.400 [Pipeline] // withEnv
00:01:38.405 [Pipeline] }
00:01:38.418 [Pipeline] // stage
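autorun-spdk.conf is just a KEY=value shell fragment; the NIC-setup wrapper sources it under `set -ex` and exits early when no NVMf NIC flavor is requested, which is exactly what the trace above shows. A condensed sketch of that consumption pattern (the mlx5 mapping is an assumption for illustration; this run takes the empty default):

  set -ex
  conf=/var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
  [[ -f $conf ]] && source "$conf"
  case "${SPDK_TEST_NVMF_NICS:-}" in
      mlx5) DRIVERS=mlx5_ib ;;   # assumed mapping, not shown in this log
      *)    DRIVERS= ;;          # this run: variable unset, so nothing to load
  esac
  [[ -n $DRIVERS ]] || exit 0    # no NIC drivers requested, succeed immediately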
00:01:38.426 [Pipeline] catchError
00:01:38.427 [Pipeline] {
00:01:38.442 [Pipeline] timeout
00:01:38.443 Timeout set to expire in 30 min
00:01:38.445 [Pipeline] {
00:01:38.460 [Pipeline] stage
00:01:38.462 [Pipeline] { (Tests)
00:01:38.477 [Pipeline] sh
00:01:38.766 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:38.766 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:38.766 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:01:38.766 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:01:38.766 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:38.766 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:38.766 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:01:38.766 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:38.766 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:38.766 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:38.766 + [[ short-fuzz-phy-autotest == pkgdep-* ]]
00:01:38.766 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:38.766 + source /etc/os-release
00:01:38.766 ++ NAME='Fedora Linux'
00:01:38.766 ++ VERSION='39 (Cloud Edition)'
00:01:38.766 ++ ID=fedora
00:01:38.766 ++ VERSION_ID=39
00:01:38.766 ++ VERSION_CODENAME=
00:01:38.766 ++ PLATFORM_ID=platform:f39
00:01:38.766 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)'
00:01:38.766 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:38.766 ++ LOGO=fedora-logo-icon
00:01:38.766 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39
00:01:38.766 ++ HOME_URL=https://fedoraproject.org/
00:01:38.766 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/
00:01:38.766 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:38.766 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:38.766 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:38.766 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39
00:01:38.766 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:38.766 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39
00:01:38.766 ++ SUPPORT_END=2024-11-12
00:01:38.766 ++ VARIANT='Cloud Edition'
00:01:38.766 ++ VARIANT_ID=cloud
00:01:38.766 + uname -a
00:01:38.766 Linux spdk-wfp-39 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 05:41:37 UTC 2024 x86_64 GNU/Linux
00:01:38.766 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:01:42.062 Hugepages
00:01:42.062 node hugesize free / total
00:01:42.062 node0 1048576kB 0 / 0
00:01:42.062 node0 2048kB 0 / 0
00:01:42.062 node1 1048576kB 0 / 0
00:01:42.062 node1 2048kB 0 / 0
00:01:42.062
00:01:42.062 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:42.062 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:42.062 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:42.062 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:42.062 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:42.062 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:42.062 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:42.062 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:42.062 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:42.062 NVMe 0000:1a:00.0 8086 0a54 0 nvme nvme0 nvme0n1
00:01:42.062 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:42.062 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:42.062 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:42.062 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:42.062 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:42.062 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:42.062 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:42.062 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:42.062 + rm -f /tmp/spdk-ld-path
00:01:42.062 + source autorun-spdk.conf
00:01:42.062 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:42.062 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:42.062 ++ SPDK_TEST_FUZZER=1
00:01:42.062 ++ SPDK_TEST_SETUP=1
00:01:42.062 ++ SPDK_RUN_UBSAN=1
00:01:42.062 ++ SPDK_TEST_NATIVE_DPDK=v23.11
00:01:42.062 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:42.062 ++ RUN_NIGHTLY=1
00:01:42.062 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:42.062 + [[ -n '' ]]
00:01:42.062 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
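The `setup.sh status` table above is read out of sysfs; per-NUMA-node hugepage counters live under /sys/devices/system/node. A minimal sketch that prints the same node/hugesize/free/total columns:

  # Print hugepage usage per NUMA node, as in the status table above.
  for node in /sys/devices/system/node/node[0-9]*; do
      for hp in "$node"/hugepages/hugepages-*; do
          size=${hp##*hugepages-}          # e.g. 2048kB or 1048576kB
          echo "${node##*/} $size $(cat "$hp/free_hugepages") / $(cat "$hp/nr_hugepages")"
      done
  done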
00:01:42.062 + for M in /var/spdk/build-*-manifest.txt
00:01:42.062 + [[ -f /var/spdk/build-kernel-manifest.txt ]]
00:01:42.062 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:42.062 + for M in /var/spdk/build-*-manifest.txt
00:01:42.062 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:42.062 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:42.062 + for M in /var/spdk/build-*-manifest.txt
00:01:42.062 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:42.062 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:42.062 ++ uname
00:01:42.062 + [[ Linux == \L\i\n\u\x ]]
00:01:42.062 + sudo dmesg -T
00:01:42.322 + sudo dmesg --clear
00:01:42.322 + dmesg_pid=1334433
00:01:42.322 + [[ Fedora Linux == FreeBSD ]]
00:01:42.322 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:42.322 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:42.322 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:42.322 + [[ -x /usr/src/fio-static/fio ]]
00:01:42.322 + export FIO_BIN=/usr/src/fio-static/fio
00:01:42.322 + FIO_BIN=/usr/src/fio-static/fio
00:01:42.322 + sudo dmesg -Tw
00:01:42.322 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:42.322 + [[ ! -v VFIO_QEMU_BIN ]]
00:01:42.322 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:01:42.322 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:42.322 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:42.322 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:01:42.322 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:42.322 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:42.322 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
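The probes above export FIO_BIN, VFIO_QEMU_BIN and QEMU_BIN only when the corresponding binary is actually installed, so downstream tests can branch on the variables instead of hard-coded paths. The same pattern in isolation (probe is a hypothetical helper, not part of autotest_common.sh):

  # Export a tool path only if it exists; leave the variable unset otherwise.
  probe() { [[ -e $2 ]] && export "$1=$2"; }
  probe FIO_BIN       /usr/src/fio-static/fio
  probe VFIO_QEMU_BIN /usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
  probe QEMU_BIN      /usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64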
00:01:42.322 15:03:20 -- common/autotest_common.sh@1692 -- $ [[ n == y ]]
00:01:42.322 15:03:20 -- spdk/autorun.sh@20 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:42.322 15:03:20 -- short-fuzz-phy-autotest/autorun-spdk.conf@1 -- $ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:42.322 15:03:20 -- short-fuzz-phy-autotest/autorun-spdk.conf@2 -- $ SPDK_TEST_FUZZER_SHORT=1
00:01:42.322 15:03:20 -- short-fuzz-phy-autotest/autorun-spdk.conf@3 -- $ SPDK_TEST_FUZZER=1
00:01:42.322 15:03:20 -- short-fuzz-phy-autotest/autorun-spdk.conf@4 -- $ SPDK_TEST_SETUP=1
00:01:42.322 15:03:20 -- short-fuzz-phy-autotest/autorun-spdk.conf@5 -- $ SPDK_RUN_UBSAN=1
00:01:42.322 15:03:20 -- short-fuzz-phy-autotest/autorun-spdk.conf@6 -- $ SPDK_TEST_NATIVE_DPDK=v23.11
00:01:42.322 15:03:20 -- short-fuzz-phy-autotest/autorun-spdk.conf@7 -- $ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:42.322 15:03:20 -- short-fuzz-phy-autotest/autorun-spdk.conf@8 -- $ RUN_NIGHTLY=1
00:01:42.322 15:03:20 -- spdk/autorun.sh@22 -- $ trap 'timing_finish || exit 1' EXIT
00:01:42.322 15:03:20 -- spdk/autorun.sh@25 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autobuild.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:42.322 15:03:20 -- common/autotest_common.sh@1692 -- $ [[ n == y ]]
00:01:42.322 15:03:20 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:01:42.322 15:03:20 -- scripts/common.sh@15 -- $ shopt -s extglob
00:01:42.322 15:03:20 -- scripts/common.sh@544 -- $ [[ -e /bin/wpdk_common.sh ]]
00:01:42.322 15:03:20 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:01:42.322 15:03:20 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:01:42.322 15:03:20 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:42.322 15:03:20 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:42.322 15:03:20 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:42.322 15:03:20 -- paths/export.sh@5 -- $ export PATH
00:01:42.322 15:03:20 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
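Note how each nested `source paths/export.sh` above prepends the Go/protoc/golangci directories again, so PATH accumulates three copies of each entry. A small guard keeps such a prepend idempotent (path_prepend is a hypothetical helper, not part of export.sh):

  # Prepend a directory to PATH only if it is not already present.
  path_prepend() {
      case ":$PATH:" in
          *":$1:"*) ;;             # already present, leave PATH alone
          *) PATH="$1:$PATH" ;;
      esac
  }
  path_prepend /opt/golangci/1.54.2/bin
  path_prepend /opt/protoc/21.7/bin
  path_prepend /opt/go/1.21.1/bin
  export PATH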
00:01:42.322 15:03:20 -- common/autobuild_common.sh@492 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:01:42.322 15:03:20 -- common/autobuild_common.sh@493 -- $ date +%s
00:01:42.322 15:03:20 -- common/autobuild_common.sh@493 -- $ mktemp -dt spdk_1732111400.XXXXXX
00:01:42.322 15:03:20 -- common/autobuild_common.sh@493 -- $ SPDK_WORKSPACE=/tmp/spdk_1732111400.kAmbh6
00:01:42.322 15:03:20 -- common/autobuild_common.sh@495 -- $ [[ -n '' ]]
00:01:42.322 15:03:20 -- common/autobuild_common.sh@499 -- $ '[' -n v23.11 ']'
00:01:42.322 15:03:20 -- common/autobuild_common.sh@500 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:42.322 15:03:20 -- common/autobuild_common.sh@500 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
00:01:42.322 15:03:20 -- common/autobuild_common.sh@506 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:01:42.322 15:03:20 -- common/autobuild_common.sh@508 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:01:42.322 15:03:20 -- common/autobuild_common.sh@509 -- $ get_config_params
00:01:42.323 15:03:20 -- common/autotest_common.sh@409 -- $ xtrace_disable
00:01:42.323 15:03:20 -- common/autotest_common.sh@10 -- $ set +x
00:01:42.323 15:03:20 -- common/autobuild_common.sh@509 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
00:01:42.323 15:03:20 -- common/autobuild_common.sh@511 -- $ start_monitor_resources
00:01:42.323 15:03:20 -- pm/common@17 -- $ local monitor
00:01:42.323 15:03:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:42.323 15:03:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:42.323 15:03:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:42.323 15:03:20 -- pm/common@21 -- $ date +%s
00:01:42.323 15:03:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:42.323 15:03:20 -- pm/common@21 -- $ date +%s
00:01:42.323 15:03:20 -- pm/common@25 -- $ sleep 1
00:01:42.323 15:03:20 -- pm/common@21 -- $ date +%s
00:01:42.323 15:03:20 -- pm/common@21 -- $ date +%s
00:01:42.323 15:03:21 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732111401
00:01:42.323 15:03:21 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732111401
00:01:42.323 15:03:21 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732111401
00:01:42.582 15:03:21 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732111401
00:01:42.582 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732111401_collect-vmstat.pm.log
00:01:42.582 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732111401_collect-cpu-load.pm.log
00:01:42.582 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732111401_collect-cpu-temp.pm.log
00:01:42.582 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732111401_collect-bmc-pm.bmc.pm.log
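start_monitor_resources backgrounds the four collectors, each redirecting into a timestamped pm.log under output/power, and the later `trap stop_monitor_resources EXIT` tears them down. The shape of that pattern, reduced to a single collector (file names here are illustrative, not the real pm/common implementation):

  outdir=/tmp/power && mkdir -p "$outdir"
  stamp=$(date +%s)
  # Background collector: one vmstat sample per second into a timestamped log.
  vmstat 1 > "$outdir/monitor.$stamp.vmstat.log" &
  echo $! > "$outdir/monitor.pid"
  stop_monitor_resources() { kill "$(cat "$outdir/monitor.pid")" 2>/dev/null || true; }
  trap stop_monitor_resources EXIT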
00:01:43.525 15:03:22 -- common/autobuild_common.sh@512 -- $ trap stop_monitor_resources EXIT
00:01:43.525 15:03:22 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:01:43.525 15:03:22 -- spdk/autobuild.sh@12 -- $ umask 022
00:01:43.525 15:03:22 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:43.525 15:03:22 -- spdk/autobuild.sh@16 -- $ date -u
00:01:43.525 Wed Nov 20 02:03:22 PM UTC 2024
00:01:43.525 15:03:22 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:43.525 v25.01-pre-219-g557f022f6
00:01:43.525 15:03:22 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:01:43.525 15:03:22 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:01:43.525 15:03:22 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:01:43.525 15:03:22 -- common/autotest_common.sh@1105 -- $ '[' 3 -le 1 ']'
00:01:43.525 15:03:22 -- common/autotest_common.sh@1111 -- $ xtrace_disable
00:01:43.525 15:03:22 -- common/autotest_common.sh@10 -- $ set +x
00:01:43.525 ************************************
00:01:43.525 START TEST ubsan
00:01:43.525 ************************************
00:01:43.525 15:03:22 ubsan -- common/autotest_common.sh@1129 -- $ echo 'using ubsan'
00:01:43.525 using ubsan
00:01:43.525
00:01:43.525 real 0m0.000s
00:01:43.525 user 0m0.000s
00:01:43.525 sys 0m0.000s
00:01:43.525 15:03:22 ubsan -- common/autotest_common.sh@1130 -- $ xtrace_disable
00:01:43.525 15:03:22 ubsan -- common/autotest_common.sh@10 -- $ set +x
00:01:43.525 ************************************
00:01:43.525 END TEST ubsan
00:01:43.525 ************************************
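The banner and zeroed real/user/sys lines around `ubsan` come from autotest's run_test wrapper, which brackets a command with START/END banners and times it. A stripped-down sketch of that harness (the real one lives in autotest_common.sh and also records timing data; this is a hedged reimplementation):

  run_test() {
      local name=$1 rc; shift
      printf '%s\nSTART TEST %s\n%s\n' '************' "$name" '************'
      time "$@"; rc=$?          # produces the real/user/sys lines seen above
      printf '%s\nEND TEST %s\n%s\n' '************' "$name" '************'
      return $rc
  }
  run_test ubsan echo 'using ubsan'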
00:01:43.525 15:03:22 -- spdk/autobuild.sh@27 -- $ '[' -n v23.11 ']'
00:01:43.525 15:03:22 -- spdk/autobuild.sh@28 -- $ build_native_dpdk
00:01:43.525 15:03:22 -- common/autobuild_common.sh@449 -- $ run_test build_native_dpdk _build_native_dpdk
00:01:43.525 15:03:22 -- common/autotest_common.sh@1105 -- $ '[' 2 -le 1 ']'
00:01:43.525 15:03:22 -- common/autotest_common.sh@1111 -- $ xtrace_disable
00:01:43.525 15:03:22 -- common/autotest_common.sh@10 -- $ set +x
00:01:43.525 ************************************
00:01:43.525 START TEST build_native_dpdk
00:01:43.525 ************************************
00:01:43.525 15:03:22 build_native_dpdk -- common/autotest_common.sh@1129 -- $ _build_native_dpdk
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@50 -- $ local compiler_version
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@51 -- $ local compiler
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@52 -- $ local dpdk_kmods
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@53 -- $ local repo=dpdk
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@55 -- $ compiler=gcc
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@61 -- $ export CC=gcc
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@61 -- $ CC=gcc
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]]
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]]
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@68 -- $ gcc -dumpversion
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@68 -- $ compiler_version=13
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@69 -- $ compiler_version=13
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@73 -- $ [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]]
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:43.525 15:03:22 build_native_dpdk -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5
00:01:43.525 eeb0605f11 version: 23.11.0
00:01:43.525 238778122a doc: update release notes for 23.11
00:01:43.525 46aa6b3cfc doc: fix description of RSS features
00:01:43.525 dd88f51a57 devtools: forbid DPDK API in cnxk base driver
00:01:43.525 7e421ae345 devtools: support skipping forbid rule check
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon'
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@86 -- $ dpdk_ldflags=
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@87 -- $ dpdk_ver=23.11.0
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]]
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]]
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror'
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]]
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]]
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow'
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@102 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base" "power/acpi" "power/amd_pstate" "power/cppc" "power/intel_pstate" "power/intel_uncore" "power/kvm_vm")
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@103 -- $ local mlx5_libs_added=n
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@104 -- $ [[ 0 -eq 1 ]]
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@104 -- $ [[ 0 -eq 1 ]]
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@146 -- $ [[ 0 -eq 1 ]]
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@174 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@175 -- $ uname -s
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@175 -- $ '[' Linux = Linux ']'
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@176 -- $ lt 23.11.0 21.11.0
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 23.11.0 '<' 21.11.0
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-:
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-:
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<'
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@345 -- $ : 1
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 ))
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 23
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@353 -- $ local d=23
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 23 =~ ^[0-9]+$ ]]
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@355 -- $ echo 23
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=23
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 21
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@353 -- $ local d=21
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 21 =~ ^[0-9]+$ ]]
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@355 -- $ echo 21
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=21
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] ))
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@367 -- $ return 1
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@180 -- $ patch -p1
00:01:43.526 patching file config/rte_config.h
00:01:43.526 Hunk #1 succeeded at 60 (offset 1 line).
00:01:43.526 15:03:22 build_native_dpdk -- common/autobuild_common.sh@183 -- $ lt 23.11.0 24.07.0
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 23.11.0 '<' 24.07.0
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-:
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-:
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<'
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@345 -- $ : 1
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 ))
00:01:43.526 15:03:22 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 23
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@353 -- $ local d=23
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 23 =~ ^[0-9]+$ ]]
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@355 -- $ echo 23
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=23
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]]
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] ))
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] ))
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@368 -- $ return 0
00:01:43.787 15:03:22 build_native_dpdk -- common/autobuild_common.sh@184 -- $ patch -p1
00:01:43.787 patching file lib/pcapng/rte_pcapng.c
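The xtrace runs around the two patches are scripts/common.sh's cmp_versions: both versions are split on '.', '-' and ':' and compared numerically field by field, so `lt 23.11.0 21.11.0` above returns 1 (23 > 21) while the `ge 23.11.0 24.07.0` check below also returns 1 (23 < 24). A compact standalone equivalent of the '<' case (a sketch, not the original function):

  # Return 0 when $1 is strictly older than $2, comparing numeric fields.
  version_lt() {
      local IFS=.-: v
      local -a a b
      read -ra a <<< "$1"; read -ra b <<< "$2"
      for (( v = 0; v < (${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]}); v++ )); do
          (( ${a[v]:-0} > ${b[v]:-0} )) && return 1
          (( ${a[v]:-0} < ${b[v]:-0} )) && return 0
      done
      return 1   # versions are equal
  }
  version_lt 23.11.0 21.11.0 || echo '23.11.0 is not older than 21.11.0'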
00:01:43.787 15:03:22 build_native_dpdk -- common/autobuild_common.sh@186 -- $ ge 23.11.0 24.07.0
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@376 -- $ cmp_versions 23.11.0 '>=' 24.07.0
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-:
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-:
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=>='
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@348 -- $ : 1
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 ))
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 23
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@353 -- $ local d=23
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 23 =~ ^[0-9]+$ ]]
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@355 -- $ echo 23
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=23
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]]
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] ))
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] ))
00:01:43.787 15:03:22 build_native_dpdk -- scripts/common.sh@368 -- $ return 1
00:01:43.787 15:03:22 build_native_dpdk -- common/autobuild_common.sh@190 -- $ dpdk_kmods=false
00:01:43.787 15:03:22 build_native_dpdk -- common/autobuild_common.sh@191 -- $ uname -s
00:01:43.787 15:03:22 build_native_dpdk -- common/autobuild_common.sh@191 -- $ '[' Linux = FreeBSD ']'
00:01:43.787 15:03:22 build_native_dpdk -- common/autobuild_common.sh@195 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base power/acpi power/amd_pstate power/cppc power/intel_pstate power/intel_uncore power/kvm_vm
00:01:43.787 15:03:22 build_native_dpdk -- common/autobuild_common.sh@195 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,power/acpi,power/amd_pstate,power/cppc,power/intel_pstate,power/intel_uncore,power/kvm_vm,
00:01:49.069 The Meson build system
00:01:49.069 Version: 1.5.0
00:01:49.069 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:01:49.069 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp
00:01:49.069 Build type: native build
00:01:49.069 Program cat found: YES (/usr/bin/cat)
00:01:49.069 Project name: DPDK
00:01:49.069 Project version: 23.11.0
00:01:49.069 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)")
00:01:49.069 C linker for the host machine: gcc ld.bfd 2.40-14
00:01:49.069 Host machine cpu family: x86_64
00:01:49.069 Host machine cpu: x86_64
00:01:49.069 Message: ## Building in Developer Mode ##
00:01:49.069 Program pkg-config found: YES (/usr/bin/pkg-config)
00:01:49.069 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh)
00:01:49.069 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh)
00:01:49.069 Program python3 found: YES (/usr/bin/python3)
00:01:49.069 Program cat found: YES (/usr/bin/cat)
00:01:49.069 config/meson.build:113: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead.
00:01:49.069 Compiler for C supports arguments -march=native: YES
00:01:49.069 Checking for size of "void *" : 8
00:01:49.069 Checking for size of "void *" : 8 (cached)
00:01:49.069 Library m found: YES
00:01:49.069 Library numa found: YES
00:01:49.069 Has header "numaif.h" : YES
00:01:49.069 Library fdt found: NO
00:01:49.069 Library execinfo found: NO
00:01:49.070 Has header "execinfo.h" : YES
00:01:49.070 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5
00:01:49.070 Run-time dependency libarchive found: NO (tried pkgconfig)
00:01:49.070 Run-time dependency libbsd found: NO (tried pkgconfig)
00:01:49.070 Run-time dependency jansson found: NO (tried pkgconfig)
00:01:49.070 Run-time dependency openssl found: YES 3.1.1
00:01:49.070 Run-time dependency libpcap found: YES 1.10.4
00:01:49.070 Has header "pcap.h" with dependency libpcap: YES
00:01:49.070 Compiler for C supports arguments -Wcast-qual: YES
00:01:49.070 Compiler for C supports arguments -Wdeprecated: YES
00:01:49.070 Compiler for C supports arguments -Wformat: YES
00:01:49.070 Compiler for C supports arguments -Wformat-nonliteral: NO
00:01:49.070 Compiler for C supports arguments -Wformat-security: NO
00:01:49.070 Compiler for C supports arguments -Wmissing-declarations: YES
00:01:49.070 Compiler for C supports arguments -Wmissing-prototypes: YES
00:01:49.070 Compiler for C supports arguments -Wnested-externs: YES
00:01:49.070 Compiler for C supports arguments -Wold-style-definition: YES
00:01:49.070 Compiler for C supports arguments -Wpointer-arith: YES
00:01:49.070 Compiler for C supports arguments -Wsign-compare: YES
00:01:49.070 Compiler for C supports arguments -Wstrict-prototypes: YES
00:01:49.070 Compiler for C supports arguments -Wundef: YES
00:01:49.070 Compiler for C supports arguments -Wwrite-strings: YES
00:01:49.070 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:01:49.070 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:01:49.070 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:01:49.070 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:01:49.070 Program objdump found: YES (/usr/bin/objdump)
00:01:49.070 Compiler for C supports arguments -mavx512f: YES
00:01:49.070 Checking if "AVX512 checking" compiles: YES
00:01:49.070 Fetching value of define "__SSE4_2__" : 1
00:01:49.070 Fetching value of define "__AES__" : 1
00:01:49.070 Fetching value of define "__AVX__" : 1
00:01:49.070 Fetching value of define "__AVX2__" : 1
00:01:49.070 Fetching value of define "__AVX512BW__" : 1
00:01:49.070 Fetching value of define "__AVX512CD__" : 1
00:01:49.070 Fetching value of define "__AVX512DQ__" : 1
00:01:49.070 Fetching value of define "__AVX512F__" : 1
00:01:49.070 Fetching value of define "__AVX512VL__" : 1
00:01:49.070 Fetching value of define "__PCLMUL__" : 1
00:01:49.070 Fetching value of define "__RDRND__" : 1
00:01:49.070 Fetching value of define "__RDSEED__" : 1
00:01:49.070 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:01:49.070 Fetching value of define "__znver1__" : (undefined)
00:01:49.070 Fetching value of define "__znver2__" : (undefined)
00:01:49.070 Fetching value of define "__znver3__" : (undefined)
00:01:49.070 Fetching value of define "__znver4__" : (undefined)
00:01:49.070 Compiler for C supports arguments -Wno-format-truncation: YES
00:01:49.070 Message: lib/log: Defining dependency "log"
00:01:49.070 Message: lib/kvargs: Defining dependency "kvargs"
00:01:49.070 Message: lib/telemetry: Defining dependency "telemetry"
"telemetry" 00:01:49.070 Checking for function "getentropy" : NO 00:01:49.070 Message: lib/eal: Defining dependency "eal" 00:01:49.070 Message: lib/ring: Defining dependency "ring" 00:01:49.070 Message: lib/rcu: Defining dependency "rcu" 00:01:49.070 Message: lib/mempool: Defining dependency "mempool" 00:01:49.070 Message: lib/mbuf: Defining dependency "mbuf" 00:01:49.070 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:49.070 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:49.070 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:49.070 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:49.070 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:49.070 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:49.070 Compiler for C supports arguments -mpclmul: YES 00:01:49.070 Compiler for C supports arguments -maes: YES 00:01:49.070 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:49.070 Compiler for C supports arguments -mavx512bw: YES 00:01:49.070 Compiler for C supports arguments -mavx512dq: YES 00:01:49.070 Compiler for C supports arguments -mavx512vl: YES 00:01:49.070 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:49.070 Compiler for C supports arguments -mavx2: YES 00:01:49.070 Compiler for C supports arguments -mavx: YES 00:01:49.070 Message: lib/net: Defining dependency "net" 00:01:49.070 Message: lib/meter: Defining dependency "meter" 00:01:49.070 Message: lib/ethdev: Defining dependency "ethdev" 00:01:49.070 Message: lib/pci: Defining dependency "pci" 00:01:49.070 Message: lib/cmdline: Defining dependency "cmdline" 00:01:49.070 Message: lib/metrics: Defining dependency "metrics" 00:01:49.070 Message: lib/hash: Defining dependency "hash" 00:01:49.070 Message: lib/timer: Defining dependency "timer" 00:01:49.070 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:49.070 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:49.070 Fetching value of define "__AVX512CD__" : 1 (cached) 00:01:49.070 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:49.070 Message: lib/acl: Defining dependency "acl" 00:01:49.070 Message: lib/bbdev: Defining dependency "bbdev" 00:01:49.070 Message: lib/bitratestats: Defining dependency "bitratestats" 00:01:49.070 Run-time dependency libelf found: YES 0.191 00:01:49.070 Message: lib/bpf: Defining dependency "bpf" 00:01:49.070 Message: lib/cfgfile: Defining dependency "cfgfile" 00:01:49.070 Message: lib/compressdev: Defining dependency "compressdev" 00:01:49.070 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:49.070 Message: lib/distributor: Defining dependency "distributor" 00:01:49.070 Message: lib/dmadev: Defining dependency "dmadev" 00:01:49.070 Message: lib/efd: Defining dependency "efd" 00:01:49.070 Message: lib/eventdev: Defining dependency "eventdev" 00:01:49.070 Message: lib/dispatcher: Defining dependency "dispatcher" 00:01:49.070 Message: lib/gpudev: Defining dependency "gpudev" 00:01:49.070 Message: lib/gro: Defining dependency "gro" 00:01:49.070 Message: lib/gso: Defining dependency "gso" 00:01:49.070 Message: lib/ip_frag: Defining dependency "ip_frag" 00:01:49.070 Message: lib/jobstats: Defining dependency "jobstats" 00:01:49.070 Message: lib/latencystats: Defining dependency "latencystats" 00:01:49.070 Message: lib/lpm: Defining dependency "lpm" 00:01:49.070 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:49.070 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:49.070 Fetching value of define "__AVX512IFMA__" : 
00:01:49.070 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES
00:01:49.070 Message: lib/member: Defining dependency "member"
00:01:49.070 Message: lib/pcapng: Defining dependency "pcapng"
00:01:49.070 Compiler for C supports arguments -Wno-cast-qual: YES
00:01:49.070 Message: lib/power: Defining dependency "power"
00:01:49.070 Message: lib/rawdev: Defining dependency "rawdev"
00:01:49.070 Message: lib/regexdev: Defining dependency "regexdev"
00:01:49.070 Message: lib/mldev: Defining dependency "mldev"
00:01:49.070 Message: lib/rib: Defining dependency "rib"
00:01:49.070 Message: lib/reorder: Defining dependency "reorder"
00:01:49.070 Message: lib/sched: Defining dependency "sched"
00:01:49.070 Message: lib/security: Defining dependency "security"
00:01:49.070 Message: lib/stack: Defining dependency "stack"
00:01:49.070 Has header "linux/userfaultfd.h" : YES
00:01:49.070 Has header "linux/vduse.h" : YES
00:01:49.070 Message: lib/vhost: Defining dependency "vhost"
00:01:49.070 Message: lib/ipsec: Defining dependency "ipsec"
00:01:49.070 Message: lib/pdcp: Defining dependency "pdcp"
00:01:49.070 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:49.070 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:49.070 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:49.070 Message: lib/fib: Defining dependency "fib"
00:01:49.070 Message: lib/port: Defining dependency "port"
00:01:49.070 Message: lib/pdump: Defining dependency "pdump"
00:01:49.070 Message: lib/table: Defining dependency "table"
00:01:49.070 Message: lib/pipeline: Defining dependency "pipeline"
00:01:49.070 Message: lib/graph: Defining dependency "graph"
00:01:49.070 Message: lib/node: Defining dependency "node"
00:01:49.070 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:01:50.019 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:01:50.019 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:01:50.019 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:01:50.019 Compiler for C supports arguments -Wno-sign-compare: YES
00:01:50.019 Compiler for C supports arguments -Wno-unused-value: YES
00:01:50.019 Compiler for C supports arguments -Wno-format: YES
00:01:50.019 Compiler for C supports arguments -Wno-format-security: YES
00:01:50.019 Compiler for C supports arguments -Wno-format-nonliteral: YES
00:01:50.019 Compiler for C supports arguments -Wno-strict-aliasing: YES
00:01:50.019 Compiler for C supports arguments -Wno-unused-but-set-variable: YES
00:01:50.019 Compiler for C supports arguments -Wno-unused-parameter: YES
00:01:50.019 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:50.019 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:50.020 Compiler for C supports arguments -mavx512f: YES (cached)
00:01:50.020 Compiler for C supports arguments -mavx512bw: YES (cached)
00:01:50.020 Compiler for C supports arguments -march=skylake-avx512: YES
00:01:50.020 Message: drivers/net/i40e: Defining dependency "net_i40e"
00:01:50.020 Has header "sys/epoll.h" : YES
00:01:50.020 Program doxygen found: YES (/usr/local/bin/doxygen)
00:01:50.020 Configuring doxy-api-html.conf using configuration
00:01:50.020 Configuring doxy-api-man.conf using configuration
00:01:50.020 Program mandb found: YES (/usr/bin/mandb)
00:01:50.020 Program sphinx-build found: NO
00:01:50.020 Configuring rte_build_config.h using configuration
00:01:50.020 Message:
00:01:50.020 =================
00:01:50.020 Applications Enabled
00:01:50.020 =================
00:01:50.020
00:01:50.020 apps:
00:01:50.020 dumpcap, graph, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf,
00:01:50.020 test-crypto-perf, test-dma-perf, test-eventdev, test-fib, test-flow-perf, test-gpudev, test-mldev, test-pipeline,
00:01:50.020 test-pmd, test-regex, test-sad, test-security-perf,
00:01:50.020
00:01:50.020 Message:
00:01:50.020 =================
00:01:50.020 Libraries Enabled
00:01:50.020 =================
00:01:50.020
00:01:50.020 libs:
00:01:50.020 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:01:50.020 net, meter, ethdev, pci, cmdline, metrics, hash, timer,
00:01:50.020 acl, bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor,
00:01:50.020 dmadev, efd, eventdev, dispatcher, gpudev, gro, gso, ip_frag,
00:01:50.020 jobstats, latencystats, lpm, member, pcapng, power, rawdev, regexdev,
00:01:50.020 mldev, rib, reorder, sched, security, stack, vhost, ipsec,
00:01:50.020 pdcp, fib, port, pdump, table, pipeline, graph, node,
00:01:50.020
00:01:50.020
00:01:50.020 Message:
00:01:50.020 ===============
00:01:50.020 Drivers Enabled
00:01:50.020 ===============
00:01:50.020
00:01:50.020 common:
00:01:50.020
00:01:50.020 bus:
00:01:50.020 pci, vdev,
00:01:50.020 mempool:
00:01:50.020 ring,
00:01:50.020 dma:
00:01:50.020
00:01:50.020 net:
00:01:50.020 i40e,
00:01:50.020 raw:
00:01:50.020
00:01:50.020 crypto:
00:01:50.020
00:01:50.020 compress:
00:01:50.020
00:01:50.020 regex:
00:01:50.020
00:01:50.020 ml:
00:01:50.020
00:01:50.020 vdpa:
00:01:50.020
00:01:50.020 event:
00:01:50.020
00:01:50.020 baseband:
00:01:50.020
00:01:50.020 gpu:
00:01:50.020
00:01:50.020
00:01:50.020 Message:
00:01:50.020 =================
00:01:50.020 Content Skipped
00:01:50.020 =================
00:01:50.020
00:01:50.020 apps:
00:01:50.020
00:01:50.020 libs:
00:01:50.020
00:01:50.020 drivers:
00:01:50.020 common/cpt: not in enabled drivers build config
00:01:50.020 common/dpaax: not in enabled drivers build config
00:01:50.020 common/iavf: not in enabled drivers build config
00:01:50.020 common/idpf: not in enabled drivers build config
00:01:50.020 common/mvep: not in enabled drivers build config
00:01:50.020 common/octeontx: not in enabled drivers build config
00:01:50.020 bus/auxiliary: not in enabled drivers build config
00:01:50.020 bus/cdx: not in enabled drivers build config
00:01:50.020 bus/dpaa: not in enabled drivers build config
00:01:50.020 bus/fslmc: not in enabled drivers build config
00:01:50.020 bus/ifpga: not in enabled drivers build config
00:01:50.020 bus/platform: not in enabled drivers build config
00:01:50.020 bus/vmbus: not in enabled drivers build config
00:01:50.020 common/cnxk: not in enabled drivers build config
00:01:50.020 common/mlx5: not in enabled drivers build config
00:01:50.020 common/nfp: not in enabled drivers build config
00:01:50.020 common/qat: not in enabled drivers build config
00:01:50.020 common/sfc_efx: not in enabled drivers build config
00:01:50.020 mempool/bucket: not in enabled drivers build config
00:01:50.020 mempool/cnxk: not in enabled drivers build config
00:01:50.020 mempool/dpaa: not in enabled drivers build config
00:01:50.020 mempool/dpaa2: not in enabled drivers build config
00:01:50.020 mempool/octeontx: not in enabled drivers build config
00:01:50.020 mempool/stack: not in enabled drivers build config
00:01:50.020 dma/cnxk: not in enabled drivers build config
00:01:50.020 dma/dpaa: not in enabled drivers build config
00:01:50.020 dma/dpaa2: not in enabled drivers build config
00:01:50.020 dma/hisilicon: not in enabled drivers build config
00:01:50.020 dma/idxd: not in enabled drivers build config
00:01:50.020 dma/ioat: not in enabled drivers build config
00:01:50.020 dma/skeleton: not in enabled drivers build config
00:01:50.020 net/af_packet: not in enabled drivers build config
00:01:50.020 net/af_xdp: not in enabled drivers build config
00:01:50.020 net/ark: not in enabled drivers build config
00:01:50.020 net/atlantic: not in enabled drivers build config
00:01:50.020 net/avp: not in enabled drivers build config
00:01:50.020 net/axgbe: not in enabled drivers build config
00:01:50.020 net/bnx2x: not in enabled drivers build config
00:01:50.020 net/bnxt: not in enabled drivers build config
00:01:50.020 net/bonding: not in enabled drivers build config
00:01:50.020 net/cnxk: not in enabled drivers build config
00:01:50.020 net/cpfl: not in enabled drivers build config
00:01:50.020 net/cxgbe: not in enabled drivers build config
00:01:50.020 net/dpaa: not in enabled drivers build config
00:01:50.020 net/dpaa2: not in enabled drivers build config
00:01:50.020 net/e1000: not in enabled drivers build config
00:01:50.020 net/ena: not in enabled drivers build config
00:01:50.020 net/enetc: not in enabled drivers build config
00:01:50.020 net/enetfec: not in enabled drivers build config
00:01:50.020 net/enic: not in enabled drivers build config
00:01:50.020 net/failsafe: not in enabled drivers build config
00:01:50.020 net/fm10k: not in enabled drivers build config
00:01:50.020 net/gve: not in enabled drivers build config
00:01:50.020 net/hinic: not in enabled drivers build config
00:01:50.020 net/hns3: not in enabled drivers build config
00:01:50.020 net/iavf: not in enabled drivers build config
00:01:50.020 net/ice: not in enabled drivers build config
00:01:50.020 net/idpf: not in enabled drivers build config
00:01:50.020 net/igc: not in enabled drivers build config
00:01:50.020 net/ionic: not in enabled drivers build config
00:01:50.020 net/ipn3ke: not in enabled drivers build config
00:01:50.020 net/ixgbe: not in enabled drivers build config
00:01:50.020 net/mana: not in enabled drivers build config
00:01:50.020 net/memif: not in enabled drivers build config
00:01:50.020 net/mlx4: not in enabled drivers build config
00:01:50.020 net/mlx5: not in enabled drivers build config
00:01:50.020 net/mvneta: not in enabled drivers build config
00:01:50.020 net/mvpp2: not in enabled drivers build config
00:01:50.020 net/netvsc: not in enabled drivers build config
00:01:50.020 net/nfb: not in enabled drivers build config
00:01:50.020 net/nfp: not in enabled drivers build config
00:01:50.020 net/ngbe: not in enabled drivers build config
00:01:50.020 net/null: not in enabled drivers build config
00:01:50.020 net/octeontx: not in enabled drivers build config
00:01:50.020 net/octeon_ep: not in enabled drivers build config
00:01:50.020 net/pcap: not in enabled drivers build config
00:01:50.020 net/pfe: not in enabled drivers build config
00:01:50.020 net/qede: not in enabled drivers build config
00:01:50.020 net/ring: not in enabled drivers build config
00:01:50.020 net/sfc: not in enabled drivers build config
00:01:50.020 net/softnic: not in enabled drivers build config
00:01:50.020 net/tap: not in enabled drivers build config
00:01:50.020 net/thunderx: not in enabled drivers build config
00:01:50.020 net/txgbe: not in enabled drivers build config
00:01:50.020 net/vdev_netvsc: not in enabled drivers build config
00:01:50.020 net/vhost: not in enabled drivers build config
build config 00:01:50.020 net/virtio: not in enabled drivers build config 00:01:50.020 net/vmxnet3: not in enabled drivers build config 00:01:50.020 raw/cnxk_bphy: not in enabled drivers build config 00:01:50.020 raw/cnxk_gpio: not in enabled drivers build config 00:01:50.020 raw/dpaa2_cmdif: not in enabled drivers build config 00:01:50.020 raw/ifpga: not in enabled drivers build config 00:01:50.020 raw/ntb: not in enabled drivers build config 00:01:50.020 raw/skeleton: not in enabled drivers build config 00:01:50.020 crypto/armv8: not in enabled drivers build config 00:01:50.020 crypto/bcmfs: not in enabled drivers build config 00:01:50.020 crypto/caam_jr: not in enabled drivers build config 00:01:50.020 crypto/ccp: not in enabled drivers build config 00:01:50.020 crypto/cnxk: not in enabled drivers build config 00:01:50.020 crypto/dpaa_sec: not in enabled drivers build config 00:01:50.020 crypto/dpaa2_sec: not in enabled drivers build config 00:01:50.020 crypto/ipsec_mb: not in enabled drivers build config 00:01:50.020 crypto/mlx5: not in enabled drivers build config 00:01:50.020 crypto/mvsam: not in enabled drivers build config 00:01:50.020 crypto/nitrox: not in enabled drivers build config 00:01:50.020 crypto/null: not in enabled drivers build config 00:01:50.020 crypto/octeontx: not in enabled drivers build config 00:01:50.020 crypto/openssl: not in enabled drivers build config 00:01:50.020 crypto/scheduler: not in enabled drivers build config 00:01:50.020 crypto/uadk: not in enabled drivers build config 00:01:50.020 crypto/virtio: not in enabled drivers build config 00:01:50.020 compress/isal: not in enabled drivers build config 00:01:50.020 compress/mlx5: not in enabled drivers build config 00:01:50.020 compress/octeontx: not in enabled drivers build config 00:01:50.020 compress/zlib: not in enabled drivers build config 00:01:50.020 regex/mlx5: not in enabled drivers build config 00:01:50.020 regex/cn9k: not in enabled drivers build config 00:01:50.020 ml/cnxk: not in enabled drivers build config 00:01:50.020 vdpa/ifc: not in enabled drivers build config 00:01:50.020 vdpa/mlx5: not in enabled drivers build config 00:01:50.020 vdpa/nfp: not in enabled drivers build config 00:01:50.021 vdpa/sfc: not in enabled drivers build config 00:01:50.021 event/cnxk: not in enabled drivers build config 00:01:50.021 event/dlb2: not in enabled drivers build config 00:01:50.021 event/dpaa: not in enabled drivers build config 00:01:50.021 event/dpaa2: not in enabled drivers build config 00:01:50.021 event/dsw: not in enabled drivers build config 00:01:50.021 event/opdl: not in enabled drivers build config 00:01:50.021 event/skeleton: not in enabled drivers build config 00:01:50.021 event/sw: not in enabled drivers build config 00:01:50.021 event/octeontx: not in enabled drivers build config 00:01:50.021 baseband/acc: not in enabled drivers build config 00:01:50.021 baseband/fpga_5gnr_fec: not in enabled drivers build config 00:01:50.021 baseband/fpga_lte_fec: not in enabled drivers build config 00:01:50.021 baseband/la12xx: not in enabled drivers build config 00:01:50.021 baseband/null: not in enabled drivers build config 00:01:50.021 baseband/turbo_sw: not in enabled drivers build config 00:01:50.021 gpu/cuda: not in enabled drivers build config 00:01:50.021 00:01:50.021 00:01:50.021 Build targets in project: 217 00:01:50.021 00:01:50.021 DPDK 23.11.0 00:01:50.021 00:01:50.021 User defined options 00:01:50.021 libdir : lib 00:01:50.021 prefix : 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:50.021 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow 00:01:50.021 c_link_args : 00:01:50.021 enable_docs : false 00:01:50.021 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,power/acpi,power/amd_pstate,power/cppc,power/intel_pstate,power/intel_uncore,power/kvm_vm, 00:01:50.021 enable_kmods : false 00:01:50.021 machine : native 00:01:50.021 tests : false 00:01:50.021 00:01:50.021 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:50.021 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated. 00:01:50.021 15:03:28 build_native_dpdk -- common/autobuild_common.sh@199 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j72 00:01:50.021 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:01:50.021 [1/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:50.021 [2/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:50.283 [3/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:50.283 [4/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:50.283 [5/707] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:01:50.283 [6/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:50.283 [7/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:50.283 [8/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:50.283 [9/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:50.283 [10/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:50.283 [11/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:50.283 [12/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:50.283 [13/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:50.283 [14/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:50.283 [15/707] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:50.283 [16/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:50.283 [17/707] Linking static target lib/librte_kvargs.a 00:01:50.283 [18/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:50.283 [19/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:50.283 [20/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:50.283 [21/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:50.283 [22/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:50.283 [23/707] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:50.283 [24/707] Compiling C object lib/librte_log.a.p/log_log.c.o 00:01:50.283 [25/707] Linking static target lib/librte_log.a 00:01:50.546 [26/707] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.546 [27/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:50.546 [28/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:50.546 [29/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:50.546 [30/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 
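[Editor's sketch] The "User defined options" summary above is meson's echo of the configure step. A minimal reconstruction of an equivalent invocation, built only from the options printed in this log (the exact command line comes from SPDK's autobuild scripts and is not captured here), and spelling out the `setup` subcommand that the deprecation warning asks for:

    # Hedged reconstruction from the logged "User defined options";
    # not the literal command the CI ran. Run from the dpdk checkout.
    meson setup build-tmp \
      --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build \
      --libdir=lib \
      -Dc_args='-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
      -Denable_docs=false \
      -Denable_kmods=false \
      -Dtests=false \
      -Dmachine=native \
      -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,power/acpi,power/amd_pstate,power/cppc,power/intel_pstate,power/intel_uncore,power/kvm_vm
    ninja -C build-tmp -j72    # matches the build step logged above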
00:01:50.807 [31/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:50.807 [32/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:50.807 [33/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:50.807 [34/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:50.807 [35/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:50.807 [36/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:50.807 [37/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:50.807 [38/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:50.807 [39/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:50.807 [40/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:50.807 [41/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:50.807 [42/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:50.807 [43/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:50.807 [44/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:50.807 [45/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:50.807 [46/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:50.807 [47/707] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:50.807 [48/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:50.807 [49/707] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:50.807 [50/707] Linking static target lib/librte_ring.a 00:01:50.808 [51/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:50.808 [52/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:50.808 [53/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:50.808 [54/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:50.808 [55/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:50.808 [56/707] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:50.808 [57/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:50.808 [58/707] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:50.808 [59/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:50.808 [60/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:50.808 [61/707] Linking static target lib/librte_pci.a 00:01:50.808 [62/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:50.808 [63/707] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:50.808 [64/707] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:50.808 [65/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:50.808 [66/707] Linking static target lib/librte_meter.a 00:01:50.808 [67/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:50.808 [68/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:50.808 [69/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:50.808 [70/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:50.808 [71/707] Compiling C object 
lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:50.808 [72/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:50.808 [73/707] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:50.808 [74/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:50.808 [75/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:50.808 [76/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:50.808 [77/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:50.808 [78/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:50.808 [79/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:50.808 [80/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:51.069 [81/707] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:51.069 [82/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:51.069 [83/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:51.069 [84/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:51.069 [85/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:51.069 [86/707] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:51.069 [87/707] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:51.069 [88/707] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:51.069 [89/707] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.069 [90/707] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:51.069 [91/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:51.069 [92/707] Linking static target lib/librte_net.a 00:01:51.069 [93/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:51.069 [94/707] Linking target lib/librte_log.so.24.0 00:01:51.069 [95/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:51.069 [96/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:51.069 [97/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:51.069 [98/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:51.069 [99/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:51.069 [100/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:51.069 [101/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:51.328 [102/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:51.328 [103/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:51.328 [104/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:51.328 [105/707] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.328 [106/707] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.328 [107/707] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.328 [108/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:51.328 [109/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:51.328 [110/707] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:01:51.328 
[111/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:51.328 [112/707] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:01:51.328 [113/707] Linking target lib/librte_kvargs.so.24.0 00:01:51.328 [114/707] Linking static target lib/librte_cfgfile.a 00:01:51.328 [115/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:51.328 [116/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:51.328 [117/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:51.328 [118/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:51.328 [119/707] Linking static target lib/librte_cmdline.a 00:01:51.328 [120/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:51.328 [121/707] Linking static target lib/librte_mempool.a 00:01:51.328 [122/707] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.589 [123/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:51.589 [124/707] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:01:51.589 [125/707] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:51.589 [126/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:51.589 [127/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:01:51.589 [128/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:01:51.589 [129/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:01:51.589 [130/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:51.589 [131/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:01:51.589 [132/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:01:51.589 [133/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:01:51.589 [134/707] Linking static target lib/librte_metrics.a 00:01:51.589 [135/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:51.589 [136/707] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:01:51.589 [137/707] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:01:51.589 [138/707] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:01:51.589 [139/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:51.589 [140/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:51.589 [141/707] Linking static target lib/librte_bitratestats.a 00:01:51.590 [142/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:01:51.854 [143/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:51.854 [144/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:51.854 [145/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:01:51.854 [146/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:51.854 [147/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:51.854 [148/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:01:51.854 [149/707] Linking static target lib/librte_telemetry.a 00:01:51.854 [150/707] Linking static target lib/librte_eal.a 00:01:51.854 [151/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:01:51.854 [152/707] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 
00:01:51.854 [153/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:01:51.854 [154/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:51.854 [155/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:51.854 [156/707] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:51.854 [157/707] Linking static target lib/librte_compressdev.a 00:01:51.854 [158/707] Linking static target lib/librte_rcu.a 00:01:51.854 [159/707] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.854 [160/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:01:51.854 [161/707] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.854 [162/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:01:51.854 [163/707] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:51.854 [164/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:01:51.854 [165/707] Linking static target lib/librte_timer.a 00:01:51.854 [166/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:51.854 [167/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:52.115 [168/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:01:52.115 [169/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:52.115 [170/707] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:52.115 [171/707] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:01:52.115 [172/707] Linking static target lib/librte_bbdev.a 00:01:52.115 [173/707] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:52.115 [174/707] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:52.115 [175/707] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:52.115 [176/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:52.115 [177/707] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.115 [178/707] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:01:52.115 [179/707] Linking static target lib/librte_mbuf.a 00:01:52.115 [180/707] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:01:52.115 [181/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:01:52.115 [182/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:01:52.115 [183/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:01:52.377 [184/707] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:01:52.377 [185/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:01:52.377 [186/707] Compiling C object lib/librte_dispatcher.a.p/dispatcher_rte_dispatcher.c.o 00:01:52.377 [187/707] Linking static target lib/librte_dispatcher.a 00:01:52.377 [188/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:01:52.377 [189/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:01:52.377 [190/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp6.c.o 00:01:52.377 [191/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:01:52.377 [192/707] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:01:52.377 [193/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 
00:01:52.377 [194/707] Linking static target lib/librte_jobstats.a 00:01:52.377 [195/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:52.377 [196/707] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.377 [197/707] Linking static target lib/librte_dmadev.a 00:01:52.377 [198/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:01:52.377 [199/707] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:01:52.377 [200/707] Linking static target lib/librte_distributor.a 00:01:52.377 [201/707] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.377 [202/707] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:01:52.377 [203/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:01:52.377 [204/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:01:52.377 [205/707] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.377 [206/707] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:01:52.377 [207/707] Linking static target lib/librte_gro.a 00:01:52.639 [208/707] Linking static target lib/librte_gpudev.a 00:01:52.639 [209/707] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.639 [210/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:52.639 [211/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:01:52.639 [212/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:01:52.639 [213/707] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:01:52.639 [214/707] Linking target lib/librte_telemetry.so.24.0 00:01:52.639 [215/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:01:52.639 [216/707] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:01:52.639 [217/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:01:52.639 [218/707] Linking static target lib/librte_gso.a 00:01:52.639 [219/707] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:01:52.639 [220/707] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:01:52.639 [221/707] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:01:52.639 [222/707] Linking static target lib/member/libsketch_avx512_tmp.a 00:01:52.639 [223/707] Linking static target lib/librte_latencystats.a 00:01:52.639 [224/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_dma_adapter.c.o 00:01:52.639 [225/707] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:52.639 [226/707] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.639 [227/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:01:52.639 [228/707] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:52.639 [229/707] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:52.639 [230/707] Linking static target lib/librte_bpf.a 00:01:52.639 [231/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:01:52.639 [232/707] Linking static target lib/librte_ip_frag.a 00:01:52.639 [233/707] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.901 [234/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:01:52.901 
[235/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev_pmd.c.o 00:01:52.901 [236/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils.c.o 00:01:52.901 [237/707] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:52.901 [238/707] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:01:52.901 [239/707] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:52.901 [240/707] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.901 [241/707] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.901 [242/707] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:52.901 [243/707] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:52.901 [244/707] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.901 [245/707] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.901 [246/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar_bfloat16.c.o 00:01:52.901 [247/707] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:01:52.901 [248/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:01:52.901 [249/707] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:01:52.901 [250/707] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.901 [251/707] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.168 [252/707] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.168 [253/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:01:53.168 [254/707] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.168 [255/707] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:01:53.168 [256/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:01:53.168 [257/707] Generating lib/dispatcher.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.168 [258/707] Linking static target lib/librte_regexdev.a 00:01:53.168 [259/707] Linking static target lib/librte_stack.a 00:01:53.168 [260/707] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:53.168 [261/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev.c.o 00:01:53.168 [262/707] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:01:53.168 [263/707] Linking static target lib/librte_pcapng.a 00:01:53.168 [264/707] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:01:53.168 [265/707] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.168 [266/707] Linking static target lib/librte_rawdev.a 00:01:53.168 [267/707] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:53.168 [268/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:01:53.168 [269/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar.c.o 00:01:53.168 [270/707] Linking static target lib/librte_power.a 00:01:53.168 [271/707] Linking static target lib/librte_mldev.a 00:01:53.168 [272/707] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.168 [273/707] Compiling C 
object lib/librte_rib.a.p/rib_rte_rib.c.o 00:01:53.168 [274/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:01:53.432 [275/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:01:53.432 [276/707] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:01:53.432 [277/707] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:01:53.432 [278/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:01:53.432 [279/707] Linking static target lib/librte_efd.a 00:01:53.432 [280/707] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.432 [281/707] Linking static target lib/librte_lpm.a 00:01:53.432 [282/707] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:53.432 [283/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:01:53.433 [284/707] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:53.433 [285/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_reorder.c.o 00:01:53.433 [286/707] Linking static target lib/librte_security.a 00:01:53.433 [287/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:01:53.433 [288/707] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:01:53.433 [289/707] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:53.433 [290/707] Linking static target lib/librte_reorder.a 00:01:53.433 [291/707] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:53.433 [292/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_crypto.c.o 00:01:53.433 [293/707] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:01:53.433 [294/707] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.433 [295/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:01:53.433 [296/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:01:53.702 [297/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_ctrl_pdu.c.o 00:01:53.702 [298/707] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:01:53.702 [299/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:53.702 [300/707] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:53.702 [301/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:53.702 [302/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_cnt.c.o 00:01:53.702 [303/707] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:01:53.702 [304/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:01:53.702 [305/707] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:01:53.702 [306/707] Linking static target lib/librte_rib.a 00:01:53.702 [307/707] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:01:53.702 [308/707] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:53.702 [309/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:01:53.702 [310/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:01:53.702 [311/707] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.962 [312/707] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.962 [313/707] Compiling C object lib/librte_pdcp.a.p/pdcp_rte_pdcp.c.o 00:01:53.962 [314/707] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.962 [315/707] 
Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:01:53.962 [316/707] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.962 [317/707] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.222 [318/707] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:01:54.222 [319/707] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:01:54.222 [320/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:01:54.222 [321/707] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:01:54.222 [322/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:01:54.222 [323/707] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.222 [324/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:01:54.222 [325/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:01:54.222 [326/707] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.222 [327/707] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:01:54.222 [328/707] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:01:54.222 [329/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:01:54.222 [330/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:01:54.222 [331/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:01:54.222 [332/707] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.222 [333/707] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:01:54.222 [334/707] Compiling C object lib/librte_node.a.p/node_null.c.o 00:01:54.222 [335/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:01:54.222 [336/707] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:01:54.484 [337/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:54.484 [338/707] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:01:54.484 [339/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:01:54.484 [340/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:01:54.484 [341/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:01:54.484 [342/707] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:01:54.484 [343/707] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:01:54.484 [344/707] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:01:54.484 [345/707] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:01:54.484 [346/707] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:01:54.484 [347/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:54.484 [348/707] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.484 [349/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:54.484 [350/707] Linking static target lib/librte_cryptodev.a 00:01:54.484 [351/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_worker.c.o 00:01:54.484 [352/707] Compiling C object lib/librte_graph.a.p/graph_graph_pcap.c.o 00:01:54.484 [353/707] Compiling C object lib/librte_graph.a.p/graph_node.c.o 
00:01:54.484 [354/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:01:54.484 [355/707] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:01:54.484 [356/707] Linking static target lib/librte_fib.a 00:01:54.484 [357/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:01:54.743 [358/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:01:54.743 [359/707] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:01:54.743 [360/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:54.743 [361/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:54.743 [362/707] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:01:54.743 [363/707] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:01:54.743 [364/707] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:01:54.743 [365/707] Compiling C object lib/librte_node.a.p/node_log.c.o 00:01:54.743 [366/707] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:01:55.003 [367/707] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:01:55.003 [368/707] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:01:55.003 [369/707] Linking static target lib/librte_pdump.a 00:01:55.003 [370/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:01:55.003 [371/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:55.003 [372/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:01:55.003 [373/707] Compiling C object lib/librte_node.a.p/node_kernel_tx.c.o 00:01:55.003 [374/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:01:55.003 [375/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:55.003 [376/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:55.003 [377/707] Compiling C object lib/librte_node.a.p/node_ip4_local.c.o 00:01:55.266 [378/707] Compiling C object lib/librte_node.a.p/node_ip4_reassembly.c.o 00:01:55.266 [379/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:01:55.266 [380/707] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.266 [381/707] Compiling C object lib/acl/libavx2_tmp.a.p/acl_run_avx2.c.o 00:01:55.266 [382/707] Linking static target lib/acl/libavx2_tmp.a 00:01:55.266 [383/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:55.266 [384/707] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:55.266 [385/707] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:01:55.266 [386/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_model_mcore_dispatch.c.o 00:01:55.266 [387/707] Linking static target lib/librte_sched.a 00:01:55.266 [388/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:01:55.266 [389/707] Linking static target lib/librte_graph.a 00:01:55.266 [390/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:01:55.266 [391/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:55.266 [392/707] Generating lib/mldev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.266 [393/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:01:55.266 [394/707] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:55.266 [395/707] Linking static target 
lib/librte_hash.a 00:01:55.266 [396/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:01:55.266 [397/707] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:01:55.266 [398/707] Linking static target lib/librte_member.a 00:01:55.266 [399/707] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.266 [400/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:01:55.266 [401/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:01:55.266 [402/707] Compiling C object lib/librte_node.a.p/node_udp4_input.c.o 00:01:55.266 [403/707] Linking static target lib/librte_table.a 00:01:55.266 [404/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:01:55.527 [405/707] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:01:55.527 [406/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:55.527 [407/707] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:55.527 [408/707] Compiling C object app/dpdk-graph.p/graph_cli.c.o 00:01:55.527 [409/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:01:55.527 [410/707] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:55.527 [411/707] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:55.527 [412/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:01:55.527 [413/707] Compiling C object app/dpdk-graph.p/graph_conn.c.o 00:01:55.527 [414/707] Linking static target drivers/librte_bus_vdev.a 00:01:55.527 [415/707] Compiling C object app/dpdk-graph.p/graph_ethdev_rx.c.o 00:01:55.527 [416/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:01:55.527 [417/707] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:55.527 [418/707] Compiling C object lib/librte_node.a.p/node_kernel_rx.c.o 00:01:55.527 [419/707] Linking static target lib/librte_ipsec.a 00:01:55.527 [420/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:01:55.527 [421/707] Compiling C object app/dpdk-graph.p/graph_ip4_route.c.o 00:01:55.789 [422/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:01:55.789 [423/707] Compiling C object lib/librte_node.a.p/node_ip6_lookup.c.o 00:01:55.789 [424/707] Linking static target lib/librte_eventdev.a 00:01:55.789 [425/707] Compiling C object app/dpdk-graph.p/graph_ip6_route.c.o 00:01:55.789 [426/707] Compiling C object app/dpdk-graph.p/graph_mempool.c.o 00:01:55.789 [427/707] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.789 [428/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:01:55.789 [429/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ipsec.c.o 00:01:55.789 [430/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_recycle_mbufs_vec_common.c.o 00:01:55.789 [431/707] Compiling C object app/dpdk-graph.p/graph_main.c.o 00:01:55.789 [432/707] Compiling C object app/dpdk-graph.p/graph_utils.c.o 00:01:55.789 [433/707] Compiling C object app/dpdk-graph.p/graph_l3fwd.c.o 00:01:55.789 [434/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:01:55.789 [435/707] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.789 [436/707] Generating 
drivers/rte_bus_pci.pmd.c with a custom command 00:01:55.789 [437/707] Compiling C object app/dpdk-graph.p/graph_graph.c.o 00:01:55.789 [438/707] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:01:55.789 [439/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:01:55.789 [440/707] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:55.789 [441/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_process.c.o 00:01:56.052 [442/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:01:56.052 [443/707] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:56.052 [444/707] Linking static target drivers/librte_bus_pci.a 00:01:56.052 [445/707] Compiling C object app/dpdk-graph.p/graph_ethdev.c.o 00:01:56.052 [446/707] Linking static target lib/librte_pdcp.a 00:01:56.052 [447/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:01:56.052 [448/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:01:56.052 [449/707] Linking static target lib/librte_acl.a 00:01:56.052 [450/707] Compiling C object app/dpdk-graph.p/graph_neigh.c.o 00:01:56.052 [451/707] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.052 [452/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:01:56.052 [453/707] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:01:56.052 [454/707] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:56.052 [455/707] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:56.052 [456/707] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:01:56.052 [457/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:01:56.052 [458/707] Linking static target lib/librte_port.a 00:01:56.052 [459/707] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.052 [460/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:01:56.314 [461/707] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.314 [462/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:01:56.314 [463/707] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.314 [464/707] Compiling C object lib/librte_node.a.p/node_ip6_rewrite.c.o 00:01:56.314 [465/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:01:56.314 [466/707] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:01:56.314 [467/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_test.c.o 00:01:56.314 [468/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:01:56.314 [469/707] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:01:56.314 [470/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:01:56.314 [471/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:01:56.314 [472/707] Linking static target lib/librte_node.a 00:01:56.574 [473/707] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:56.574 [474/707] Generating lib/pdcp.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.574 [475/707] Compiling C object 
drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:56.574 [476/707] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.574 [477/707] Linking static target drivers/librte_mempool_ring.a 00:01:56.574 [478/707] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:56.574 [479/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_parser.c.o 00:01:56.574 [480/707] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.574 [481/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:01:56.574 [482/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:01:56.574 [483/707] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.575 [484/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:01:56.575 [485/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:01:56.575 [486/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:01:56.575 [487/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:01:56.837 [488/707] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.837 [489/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_main.c.o 00:01:56.837 [490/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:01:56.837 [491/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:01:56.837 [492/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:01:56.837 [493/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_main.c.o 00:01:56.837 [494/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:01:56.837 [495/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:01:56.837 [496/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:01:56.837 [497/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:01:56.837 [498/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_common.c.o 00:01:56.837 [499/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:01:56.837 [500/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_device_ops.c.o 00:01:57.099 [501/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_options.c.o 00:01:57.099 [502/707] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.099 [503/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_ordered.c.o 00:01:57.099 [504/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:01:57.099 [505/707] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:01:57.099 [506/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:01:57.099 [507/707] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.099 [508/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:01:57.099 [509/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 
00:01:57.099 [510/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_interleave.c.o 00:01:57.099 [511/707] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:01:57.099 [512/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_common.c.o 00:01:57.099 [513/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_stats.c.o 00:01:57.099 [514/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:01:57.099 [515/707] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:01:57.099 [516/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:01:57.099 [517/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_ops.c.o 00:01:57.099 [518/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:01:57.099 [519/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:01:57.099 [520/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_benchmark.c.o 00:01:57.099 [521/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:01:57.099 [522/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:57.099 [523/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:01:57.358 [524/707] Linking static target lib/librte_ethdev.a 00:01:57.358 [525/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:01:57.358 [526/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:01:57.358 [527/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:01:57.358 [528/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:01:57.358 [529/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:01:57.358 [530/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:01:57.358 [531/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:01:57.358 [532/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:01:57.358 [533/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:01:57.358 [534/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:01:57.616 [535/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:01:57.616 [536/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_cman.c.o 00:01:57.616 [537/707] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:01:57.616 [538/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:01:57.616 [539/707] Linking static target drivers/net/i40e/base/libi40e_base.a 00:01:57.616 [540/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:01:57.616 [541/707] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:01:57.616 [542/707] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:01:57.616 [543/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:01:57.616 [544/707] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:01:57.616 [545/707] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:01:57.616 [546/707] Compiling C object drivers/net/i40e/libi40e_avx2_lib.a.p/i40e_rxtx_vec_avx2.c.o 00:01:57.616 [547/707] 
Compiling C object app/dpdk-testpmd.p/test-pmd_recycle_mbufs.c.o 00:01:57.616 [548/707] Linking static target drivers/net/i40e/libi40e_avx2_lib.a 00:01:57.616 [549/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:01:57.616 [550/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:01:57.616 [551/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:01:57.616 [552/707] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:01:57.616 [553/707] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:01:57.616 [554/707] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:01:57.616 [555/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:01:57.616 [556/707] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:01:57.616 [557/707] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:01:57.874 [558/707] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:01:57.875 [559/707] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:01:57.875 [560/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:01:57.875 [561/707] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:01:57.875 [562/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:01:57.875 [563/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:01:57.875 [564/707] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:01:57.875 [565/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:01:57.875 [566/707] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:01:58.133 [567/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:01:58.133 [568/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:01:58.133 [569/707] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:01:58.133 [570/707] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:01:58.390 [571/707] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:01:58.390 [572/707] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:01:58.390 [573/707] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:01:58.650 [574/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_common.c.o 00:01:58.907 [575/707] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:01:58.907 [576/707] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.907 [577/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:01:59.165 [578/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:01:59.165 [579/707] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:01:59.424 [580/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:01:59.682 [581/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:01:59.683 [582/707] Linking static target drivers/libtmp_rte_net_i40e.a 00:01:59.683 [583/707] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:01:59.940 [584/707] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:01:59.940 [585/707] Compiling C object drivers/librte_net_i40e.so.24.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:59.940 [586/707] Compiling C object 
drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:59.940 [587/707] Linking static target drivers/librte_net_i40e.a 00:02:00.199 [588/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:02:00.199 [589/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:01.137 [590/707] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.137 [591/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:02:02.070 [592/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:02:03.004 [593/707] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.263 [594/707] Linking target lib/librte_eal.so.24.0 00:02:03.263 [595/707] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:02:03.263 [596/707] Linking target lib/librte_timer.so.24.0 00:02:03.263 [597/707] Linking target lib/librte_cfgfile.so.24.0 00:02:03.263 [598/707] Linking target lib/librte_pci.so.24.0 00:02:03.263 [599/707] Linking target lib/librte_ring.so.24.0 00:02:03.263 [600/707] Linking target lib/librte_jobstats.so.24.0 00:02:03.263 [601/707] Linking target lib/librte_meter.so.24.0 00:02:03.263 [602/707] Linking target drivers/librte_bus_vdev.so.24.0 00:02:03.263 [603/707] Linking target lib/librte_dmadev.so.24.0 00:02:03.263 [604/707] Linking target lib/librte_stack.so.24.0 00:02:03.263 [605/707] Linking target lib/librte_rawdev.so.24.0 00:02:03.263 [606/707] Linking target lib/librte_acl.so.24.0 00:02:03.522 [607/707] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:02:03.522 [608/707] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:02:03.522 [609/707] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:02:03.522 [610/707] Generating symbol file drivers/librte_bus_vdev.so.24.0.p/librte_bus_vdev.so.24.0.symbols 00:02:03.522 [611/707] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:02:03.522 [612/707] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:02:03.522 [613/707] Generating symbol file lib/librte_acl.so.24.0.p/librte_acl.so.24.0.symbols 00:02:03.522 [614/707] Linking target lib/librte_rcu.so.24.0 00:02:03.522 [615/707] Linking target lib/librte_mempool.so.24.0 00:02:03.522 [616/707] Linking target drivers/librte_bus_pci.so.24.0 00:02:03.780 [617/707] Generating symbol file drivers/librte_bus_pci.so.24.0.p/librte_bus_pci.so.24.0.symbols 00:02:03.780 [618/707] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:02:03.780 [619/707] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:02:03.780 [620/707] Linking target lib/librte_rib.so.24.0 00:02:03.780 [621/707] Linking target drivers/librte_mempool_ring.so.24.0 00:02:03.780 [622/707] Linking target lib/librte_mbuf.so.24.0 00:02:03.780 [623/707] Generating symbol file lib/librte_rib.so.24.0.p/librte_rib.so.24.0.symbols 00:02:03.780 [624/707] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:02:04.037 [625/707] Linking target lib/librte_reorder.so.24.0 00:02:04.037 [626/707] Linking target lib/librte_gpudev.so.24.0 00:02:04.038 [627/707] Linking target lib/librte_fib.so.24.0 00:02:04.038 [628/707] Linking target lib/librte_mldev.so.24.0 00:02:04.038 [629/707] Linking target lib/librte_regexdev.so.24.0 00:02:04.038 
[630/707] Linking target lib/librte_bbdev.so.24.0 00:02:04.038 [631/707] Linking target lib/librte_net.so.24.0 00:02:04.038 [632/707] Linking target lib/librte_sched.so.24.0 00:02:04.038 [633/707] Linking target lib/librte_cryptodev.so.24.0 00:02:04.038 [634/707] Linking target lib/librte_compressdev.so.24.0 00:02:04.038 [635/707] Linking target lib/librte_distributor.so.24.0 00:02:04.038 [636/707] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:02:04.038 [637/707] Generating symbol file lib/librte_reorder.so.24.0.p/librte_reorder.so.24.0.symbols 00:02:04.038 [638/707] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:02:04.038 [639/707] Generating symbol file lib/librte_sched.so.24.0.p/librte_sched.so.24.0.symbols 00:02:04.038 [640/707] Linking target lib/librte_cmdline.so.24.0 00:02:04.038 [641/707] Linking target lib/librte_hash.so.24.0 00:02:04.038 [642/707] Linking target lib/librte_security.so.24.0 00:02:04.296 [643/707] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:02:04.296 [644/707] Generating symbol file lib/librte_security.so.24.0.p/librte_security.so.24.0.symbols 00:02:04.296 [645/707] Linking target lib/librte_lpm.so.24.0 00:02:04.296 [646/707] Linking target lib/librte_efd.so.24.0 00:02:04.296 [647/707] Linking target lib/librte_member.so.24.0 00:02:04.296 [648/707] Linking target lib/librte_pdcp.so.24.0 00:02:04.296 [649/707] Linking target lib/librte_ipsec.so.24.0 00:02:04.553 [650/707] Generating symbol file lib/librte_lpm.so.24.0.p/librte_lpm.so.24.0.symbols 00:02:04.553 [651/707] Generating symbol file lib/librte_ipsec.so.24.0.p/librte_ipsec.so.24.0.symbols 00:02:05.117 [652/707] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.375 [653/707] Linking target lib/librte_ethdev.so.24.0 00:02:05.375 [654/707] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:02:05.375 [655/707] Linking target lib/librte_gso.so.24.0 00:02:05.375 [656/707] Linking target lib/librte_metrics.so.24.0 00:02:05.375 [657/707] Linking target lib/librte_pcapng.so.24.0 00:02:05.375 [658/707] Linking target lib/librte_power.so.24.0 00:02:05.375 [659/707] Linking target lib/librte_gro.so.24.0 00:02:05.375 [660/707] Linking target lib/librte_ip_frag.so.24.0 00:02:05.375 [661/707] Linking target lib/librte_bpf.so.24.0 00:02:05.375 [662/707] Linking target lib/librte_eventdev.so.24.0 00:02:05.632 [663/707] Linking target drivers/librte_net_i40e.so.24.0 00:02:05.632 [664/707] Generating symbol file lib/librte_bpf.so.24.0.p/librte_bpf.so.24.0.symbols 00:02:05.632 [665/707] Generating symbol file lib/librte_metrics.so.24.0.p/librte_metrics.so.24.0.symbols 00:02:05.632 [666/707] Generating symbol file lib/librte_pcapng.so.24.0.p/librte_pcapng.so.24.0.symbols 00:02:05.632 [667/707] Generating symbol file lib/librte_eventdev.so.24.0.p/librte_eventdev.so.24.0.symbols 00:02:05.632 [668/707] Generating symbol file lib/librte_ip_frag.so.24.0.p/librte_ip_frag.so.24.0.symbols 00:02:05.632 [669/707] Linking target lib/librte_latencystats.so.24.0 00:02:05.632 [670/707] Linking target lib/librte_bitratestats.so.24.0 00:02:05.632 [671/707] Linking target lib/librte_pdump.so.24.0 00:02:05.632 [672/707] Linking target lib/librte_port.so.24.0 00:02:05.632 [673/707] Linking target lib/librte_dispatcher.so.24.0 00:02:05.632 [674/707] Linking target lib/librte_graph.so.24.0 00:02:05.889 [675/707] Generating symbol file 
lib/librte_port.so.24.0.p/librte_port.so.24.0.symbols 00:02:05.889 [676/707] Generating symbol file lib/librte_graph.so.24.0.p/librte_graph.so.24.0.symbols 00:02:05.889 [677/707] Linking target lib/librte_table.so.24.0 00:02:05.889 [678/707] Linking target lib/librte_node.so.24.0 00:02:05.889 [679/707] Generating symbol file lib/librte_table.so.24.0.p/librte_table.so.24.0.symbols 00:02:08.418 [680/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:02:08.418 [681/707] Linking static target lib/librte_pipeline.a 00:02:08.418 [682/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:08.418 [683/707] Linking static target lib/librte_vhost.a 00:02:08.676 [684/707] Linking target app/dpdk-test-acl 00:02:08.676 [685/707] Linking target app/dpdk-test-gpudev 00:02:08.676 [686/707] Linking target app/dpdk-test-dma-perf 00:02:08.676 [687/707] Linking target app/dpdk-proc-info 00:02:08.676 [688/707] Linking target app/dpdk-test-cmdline 00:02:08.676 [689/707] Linking target app/dpdk-test-fib 00:02:08.676 [690/707] Linking target app/dpdk-test-compress-perf 00:02:08.676 [691/707] Linking target app/dpdk-pdump 00:02:08.676 [692/707] Linking target app/dpdk-dumpcap 00:02:08.676 [693/707] Linking target app/dpdk-test-security-perf 00:02:08.676 [694/707] Linking target app/dpdk-test-regex 00:02:08.676 [695/707] Linking target app/dpdk-test-flow-perf 00:02:08.676 [696/707] Linking target app/dpdk-test-sad 00:02:08.676 [697/707] Linking target app/dpdk-test-mldev 00:02:08.676 [698/707] Linking target app/dpdk-test-pipeline 00:02:08.676 [699/707] Linking target app/dpdk-test-crypto-perf 00:02:08.676 [700/707] Linking target app/dpdk-graph 00:02:08.676 [701/707] Linking target app/dpdk-test-bbdev 00:02:08.676 [702/707] Linking target app/dpdk-test-eventdev 00:02:08.934 [703/707] Linking target app/dpdk-testpmd 00:02:10.310 [704/707] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.310 [705/707] Linking target lib/librte_vhost.so.24.0 00:02:13.593 [706/707] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:13.593 [707/707] Linking target lib/librte_pipeline.so.24.0 00:02:13.593 15:03:52 build_native_dpdk -- common/autobuild_common.sh@201 -- $ uname -s 00:02:13.593 15:03:52 build_native_dpdk -- common/autobuild_common.sh@201 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:02:13.593 15:03:52 build_native_dpdk -- common/autobuild_common.sh@214 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j72 install 00:02:13.850 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:13.850 [0/1] Installing files. 
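For readers reproducing this step by hand: the trace above (uname -s, the FreeBSD check, then ninja -C ... -j72 install) is a standard meson/ninja DPDK build and install. The sketch below is illustrative, not the literal autobuild_common.sh logic; the source and build-tmp paths and the -j72 job count are taken from the log, while the meson setup invocation and the --prefix value are assumptions inferred from the install destinations that follow.

  # Minimal sketch, assuming a plain DPDK checkout at $DPDK_DIR.
  # Paths and -j72 match the log above; the meson options are inferred, not
  # copied from this run.
  DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
  if [ "$(uname -s)" = FreeBSD ]; then
      # The log shows this branch was not taken ([[ Linux == FreeBSD ]] is false).
      echo "FreeBSD-specific handling would go here" >&2
  fi
  meson setup "$DPDK_DIR/build-tmp" "$DPDK_DIR" \
      --prefix="$DPDK_DIR/build"    # prefix inferred from .../dpdk/build/share/dpdk/examples
  ninja -C "$DPDK_DIR/build-tmp" -j72          # compiles and links the [1/707]..[707/707] targets above
  ninja -C "$DPDK_DIR/build-tmp" -j72 install  # emits the "Installing ..." lines that follow

With this prefix, the examples subdirectory is copied verbatim to $DPDK_DIR/build/share/dpdk/examples, which is exactly what the install listing below records file by file.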
00:02:14.113 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.114 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:14.114 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.114 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.115 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:14.115 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:02:14.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec_sa.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.116 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd
00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node
00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node
00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared
00:02:14.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb
00:02:14.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb
00:02:14.119 Installing lib/librte_log.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_acl.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_dispatcher.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_gro.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.119 Installing lib/librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.120 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.120 Installing lib/librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.120 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_mldev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_pdcp.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_pipeline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing lib/librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing drivers/librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:02:14.382 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing drivers/librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:02:14.382 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing drivers/librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:02:14.382 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.382 Installing drivers/librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:02:14.382 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing app/dpdk-graph to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing app/dpdk-test-dma-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing app/dpdk-test-mldev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.382 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.382 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/log/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.382 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lock_annotations.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_stdatomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.383 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_dtls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_pdcp_hdr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.384 Installing
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.384 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_dma_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dispatcher/rte_dispatcher.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.385 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
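The records above and below are meson install staging DPDK's public headers into build/include. As a quick sanity check — a minimal sketch, assuming only the destination path printed in the log — one can confirm a few of the just-copied headers landed in the staging tree:

    inc=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
    for h in rte_mbuf.h rte_ethdev.h rte_ring.h rte_vhost.h; do
        # each of these appears in an "Installing ... to .../build/include" record above
        [ -f "$inc/$h" ] && echo "ok      $h" || echo "missing $h"
    done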
00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_mcore_dispatch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_rtc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip6_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_udp4_input_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.386 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/dpdk-cmdline-gen.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:14.387 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:14.387 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:14.387 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:14.387 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:14.387 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-rss-flows.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:14.387 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.387 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:14.387 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:14.387 Installing symlink pointing to librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so.24 00:02:14.387 Installing symlink pointing to librte_log.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so 00:02:14.387 Installing symlink pointing to librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.24 00:02:14.387 Installing symlink pointing to librte_kvargs.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:02:14.387 Installing symlink pointing to librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.24 00:02:14.387 Installing symlink pointing to librte_telemetry.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:02:14.387 Installing symlink pointing to librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.24 00:02:14.387 Installing symlink pointing to librte_eal.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:02:14.387 Installing symlink pointing to librte_ring.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.24 00:02:14.387 Installing symlink pointing to librte_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:02:14.387 Installing symlink pointing to librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.24 00:02:14.387 Installing symlink pointing to librte_rcu.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:02:14.387 Installing symlink pointing to librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.24 00:02:14.387 Installing symlink pointing to librte_mempool.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:02:14.387 Installing symlink pointing to librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.24 00:02:14.387 Installing symlink pointing to librte_mbuf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:02:14.387 Installing symlink pointing to librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.24 00:02:14.387 Installing symlink pointing to librte_net.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:02:14.387 Installing symlink pointing to librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.24 00:02:14.387 Installing symlink pointing to librte_meter.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:02:14.387 Installing symlink pointing to librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.24 00:02:14.387 Installing symlink pointing to librte_ethdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:02:14.387 Installing symlink pointing to librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.24 00:02:14.387 Installing symlink pointing to librte_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:14.387 Installing symlink pointing to librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.24 00:02:14.387 Installing symlink pointing to librte_cmdline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:14.387 Installing symlink pointing to librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.24 00:02:14.387 Installing symlink pointing to librte_metrics.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:14.387 Installing symlink pointing to librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.24 00:02:14.387 Installing symlink pointing to librte_hash.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:14.387 Installing symlink pointing to librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.24 00:02:14.387 Installing symlink pointing to librte_timer.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:14.387 Installing symlink pointing to librte_acl.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.24 00:02:14.387 Installing symlink pointing to librte_acl.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:14.387 Installing symlink pointing to librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.24 00:02:14.387 Installing symlink pointing to librte_bbdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:14.387 Installing symlink pointing to librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.24 00:02:14.387 Installing symlink pointing to librte_bitratestats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:14.387 Installing symlink pointing to librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.24 00:02:14.387 Installing symlink pointing to librte_bpf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:14.387 Installing symlink pointing to librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.24 00:02:14.387 Installing symlink pointing to librte_cfgfile.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:14.387 Installing symlink pointing to librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.24 00:02:14.387 Installing symlink pointing to librte_compressdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:14.387 Installing symlink pointing to librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.24 00:02:14.387 Installing symlink pointing to librte_cryptodev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:14.387 Installing symlink pointing to librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.24 00:02:14.387 Installing symlink pointing to librte_distributor.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:14.387 Installing symlink pointing to librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.24 00:02:14.387 Installing symlink pointing to librte_dmadev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:14.387 Installing symlink pointing to librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.24 00:02:14.387 Installing symlink pointing to librte_efd.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:14.387 Installing symlink pointing to librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.24 00:02:14.387 Installing symlink pointing to librte_eventdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:14.387 Installing symlink pointing to librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so.24 00:02:14.387 Installing symlink pointing to librte_dispatcher.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so 00:02:14.387 Installing symlink pointing to librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.24 00:02:14.387 Installing symlink pointing to librte_gpudev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:14.387 Installing symlink pointing to librte_gro.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.24 00:02:14.387 Installing symlink pointing to librte_gro.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:14.387 Installing symlink pointing to librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.24 00:02:14.387 Installing symlink pointing to librte_gso.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:14.387 Installing symlink pointing to librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.24 00:02:14.387 Installing symlink pointing to librte_ip_frag.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:14.387 Installing symlink pointing to librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.24 00:02:14.387 Installing symlink pointing to librte_jobstats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:14.387 Installing symlink pointing to librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.24 00:02:14.387 Installing symlink pointing to librte_latencystats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:14.387 Installing symlink pointing to librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.24 00:02:14.387 Installing symlink pointing to librte_lpm.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:14.387 Installing symlink pointing to librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.24 00:02:14.387 Installing symlink pointing to librte_member.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:14.387 Installing symlink pointing to librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.24 00:02:14.387 Installing symlink pointing to librte_pcapng.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:14.387 Installing symlink pointing to librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.24 00:02:14.387 Installing symlink pointing to librte_power.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:14.387 Installing symlink pointing to librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.24 00:02:14.387 Installing symlink pointing to librte_rawdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:14.387 Installing symlink pointing to librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.24 00:02:14.387 Installing symlink pointing to 
librte_regexdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:14.387 Installing symlink pointing to librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so.24 00:02:14.388 Installing symlink pointing to librte_mldev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so 00:02:14.388 Installing symlink pointing to librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.24 00:02:14.388 Installing symlink pointing to librte_rib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:14.388 Installing symlink pointing to librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.24 00:02:14.388 Installing symlink pointing to librte_reorder.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:14.388 Installing symlink pointing to librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.24 00:02:14.388 Installing symlink pointing to librte_sched.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:14.388 Installing symlink pointing to librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.24 00:02:14.388 Installing symlink pointing to librte_security.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:14.388 Installing symlink pointing to librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.24 00:02:14.388 Installing symlink pointing to librte_stack.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:14.388 Installing symlink pointing to librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.24 00:02:14.388 Installing symlink pointing to librte_vhost.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:14.388 Installing symlink pointing to librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.24 00:02:14.388 Installing symlink pointing to librte_ipsec.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:02:14.388 Installing symlink pointing to librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so.24 00:02:14.388 Installing symlink pointing to librte_pdcp.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so 00:02:14.388 Installing symlink pointing to librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.24 00:02:14.388 Installing symlink pointing to librte_fib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:02:14.388 Installing symlink pointing to librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.24 00:02:14.388 Installing symlink pointing to librte_port.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:02:14.388 Installing symlink pointing to librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.24 00:02:14.388 Installing symlink pointing to librte_pdump.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so
00:02:14.388 Installing symlink pointing to librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.24
00:02:14.388 Installing symlink pointing to librte_table.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so
00:02:14.388 Installing symlink pointing to librte_pipeline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.24
00:02:14.388 Installing symlink pointing to librte_pipeline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so
00:02:14.388 Installing symlink pointing to librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.24
00:02:14.388 Installing symlink pointing to librte_graph.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so
00:02:14.388 Installing symlink pointing to librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.24
00:02:14.646 './librte_bus_pci.so' -> 'dpdk/pmds-24.0/librte_bus_pci.so'
00:02:14.646 './librte_bus_pci.so.24' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24'
00:02:14.646 './librte_bus_pci.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24.0'
00:02:14.646 './librte_bus_vdev.so' -> 'dpdk/pmds-24.0/librte_bus_vdev.so'
00:02:14.646 './librte_bus_vdev.so.24' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24'
00:02:14.646 './librte_bus_vdev.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24.0'
00:02:14.646 './librte_mempool_ring.so' -> 'dpdk/pmds-24.0/librte_mempool_ring.so'
00:02:14.646 './librte_mempool_ring.so.24' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24'
00:02:14.646 './librte_mempool_ring.so.24.0' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24.0'
00:02:14.646 './librte_net_i40e.so' -> 'dpdk/pmds-24.0/librte_net_i40e.so'
00:02:14.646 './librte_net_i40e.so.24' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24'
00:02:14.646 './librte_net_i40e.so.24.0' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24.0'
00:02:14.646 Installing symlink pointing to librte_node.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so
00:02:14.646 Installing symlink pointing to librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so.24
00:02:14.646 Installing symlink pointing to librte_bus_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so
00:02:14.646 Installing symlink pointing to librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so.24
00:02:14.646 Installing symlink pointing to librte_bus_vdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so
00:02:14.646 Installing symlink pointing to librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so.24
00:02:14.646 Installing symlink pointing to librte_mempool_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so
00:02:14.646 Installing symlink pointing to librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so.24
00:02:14.646 Installing symlink pointing to librte_net_i40e.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so
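The symlink records above follow the standard ELF shared-library versioning chain (librte_X.so -> librte_X.so.24 -> librte_X.so.24.0), and the './librte_*.so*' -> 'dpdk/pmds-24.0/...' lines show the driver libraries being relocated into a PMD plugin directory before symlink-drivers-solibs.sh re-links them there. A rough illustration of the layout being produced (the ln/mv commands are illustrative, not the installer's actual implementation):

    cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
    ln -sf librte_ring.so.24.0 librte_ring.so.24   # soname link used at run time
    ln -sf librte_ring.so.24   librte_ring.so      # linker name used at build time
    mkdir -p dpdk/pmds-24.0                        # plugin dir for loadable PMDs
    mv librte_net_i40e.so.24.0 dpdk/pmds-24.0/     # drivers live under pmds-24.0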
00:02:14.646 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-24.0'
00:02:14.646 15:03:53 build_native_dpdk -- common/autobuild_common.sh@220 -- $ cat
00:02:14.646 15:03:53 build_native_dpdk -- common/autobuild_common.sh@225 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:14.646
00:02:14.646 real 0m30.971s
00:02:14.646 user 8m33.820s
00:02:14.646 sys 1m58.705s
00:02:14.646 15:03:53 build_native_dpdk -- common/autotest_common.sh@1130 -- $ xtrace_disable
00:02:14.646 15:03:53 build_native_dpdk -- common/autotest_common.sh@10 -- $ set +x
00:02:14.646 ************************************
00:02:14.646 END TEST build_native_dpdk
00:02:14.646 ************************************
00:02:14.646 15:03:53 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:02:14.646 15:03:53 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:02:14.646 15:03:53 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]]
00:02:14.646 15:03:53 -- spdk/autobuild.sh@52 -- $ llvm_precompile
00:02:14.646 15:03:53 -- common/autobuild_common.sh@445 -- $ run_test autobuild_llvm_precompile _llvm_precompile
00:02:14.646 15:03:53 -- common/autotest_common.sh@1105 -- $ '[' 2 -le 1 ']'
00:02:14.646 15:03:53 -- common/autotest_common.sh@1111 -- $ xtrace_disable
00:02:14.646 15:03:53 -- common/autotest_common.sh@10 -- $ set +x
00:02:14.646 ************************************
00:02:14.646 START TEST autobuild_llvm_precompile
00:02:14.646 ************************************
00:02:14.646 15:03:53 autobuild_llvm_precompile -- common/autotest_common.sh@1129 -- $ _llvm_precompile
00:02:14.646 15:03:53 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ clang --version
00:02:14.646 15:03:53 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39)
00:02:14.646 Target: x86_64-redhat-linux-gnu
00:02:14.647 Thread model: posix
00:02:14.647 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]]
00:02:14.647 15:03:53 autobuild_llvm_precompile -- common/autobuild_common.sh@33 -- $ clang_num=17
00:02:14.647 15:03:53 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ export CC=clang-17
00:02:14.647 15:03:53 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ CC=clang-17
00:02:14.647 15:03:53 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17
00:02:14.647 15:03:53 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ CXX=clang++-17
00:02:14.647 15:03:53 autobuild_llvm_precompile -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
00:02:14.647 15:03:53 autobuild_llvm_precompile -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:02:14.647 15:03:53 autobuild_llvm_precompile -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]]
00:02:14.647 15:03:53 autobuild_llvm_precompile -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a'
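The clang detection traced above works by matching the full `clang --version` output against a bash regex and keeping the major version. A minimal re-creation of that step (variable names mirror the log; the snippet assumes only that clang prints "clang version X.Y.Z" somewhere in its output):

    ver_out="$(clang --version)"
    if [[ $ver_out =~ version\ (([0-9]+)\.([0-9]+)\.([0-9]+)) ]]; then
        clang_num="${BASH_REMATCH[2]}"    # major version; 17 on this builder
        export CC="clang-$clang_num" CXX="clang++-$clang_num"
    fi
    # the fuzzer_libs=(...) glob above additionally requires: shopt -s extglob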
--with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:02:14.647 15:03:53 autobuild_llvm_precompile -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:14.905 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:15.163 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.163 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.163 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:15.422 Using 'verbs' RDMA provider 00:02:31.670 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:02:46.543 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:02:46.543 Creating mk/config.mk...done. 00:02:46.543 Creating mk/cc.flags.mk...done. 00:02:46.543 Type 'make' to build. 00:02:46.543 00:02:46.543 real 0m30.121s 00:02:46.543 user 0m13.363s 00:02:46.543 sys 0m16.147s 00:02:46.543 15:04:23 autobuild_llvm_precompile -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:02:46.543 15:04:23 autobuild_llvm_precompile -- common/autotest_common.sh@10 -- $ set +x 00:02:46.543 ************************************ 00:02:46.543 END TEST autobuild_llvm_precompile 00:02:46.543 ************************************ 00:02:46.543 15:04:23 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:02:46.543 15:04:23 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:02:46.543 15:04:23 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:02:46.543 15:04:23 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:02:46.543 15:04:23 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:46.543 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:46.543 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:46.543 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:46.543 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:46.543 Using 'verbs' RDMA provider 00:02:59.070 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:03:09.047 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:03:09.872 Creating mk/config.mk...done. 00:03:09.872 Creating mk/cc.flags.mk...done. 00:03:09.872 Type 'make' to build. 
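The fuzzer_libs assignment traced at autobuild_common.sh@38 above relies on bash extended globbing so the clang runtime directory matches whether it is named after the major version (17) or the full version (17.0.6). A minimal standalone sketch of that lookup, assuming the same Fedora clang 17 toolchain; the glob is copied from the trace, while the nullglob guard and the failure branch are illustrative additions:

#!/usr/bin/env bash
# Sketch of the libclang_rt.fuzzer_no_main lookup traced above.
# @(...) and ?(...) are extglob operators: "either alternative" and
# "zero or one occurrence" respectively.
shopt -s extglob nullglob
clang_num=17            # major version parsed from `clang --version`
clang_version=17.0.6    # full version from the same output
fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
if (( ${#fuzzer_libs[@]} > 0 )); then
  fuzzer_lib=${fuzzer_libs[0]}    # first match wins, as in the trace
  echo "fuzzer runtime: $fuzzer_lib"
else
  echo "no libclang_rt.fuzzer_no_main runtime found" >&2
  exit 1
fi

On the image above this resolves to /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a, which the configure invocation then forwards via --with-fuzzer.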
00:03:09.872 15:04:48 -- spdk/autobuild.sh@70 -- $ run_test make make -j72 00:03:09.872 15:04:48 -- common/autotest_common.sh@1105 -- $ '[' 3 -le 1 ']' 00:03:09.872 15:04:48 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:03:09.872 15:04:48 -- common/autotest_common.sh@10 -- $ set +x 00:03:09.872 ************************************ 00:03:09.872 START TEST make 00:03:09.872 ************************************ 00:03:09.872 15:04:48 make -- common/autotest_common.sh@1129 -- $ make -j72 00:03:10.130 make[1]: Nothing to be done for 'all'. 00:03:12.043 The Meson build system 00:03:12.043 Version: 1.5.0 00:03:12.043 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:03:12.043 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:12.043 Build type: native build 00:03:12.043 Project name: libvfio-user 00:03:12.043 Project version: 0.0.1 00:03:12.043 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:03:12.043 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:03:12.043 Host machine cpu family: x86_64 00:03:12.043 Host machine cpu: x86_64 00:03:12.043 Run-time dependency threads found: YES 00:03:12.043 Library dl found: YES 00:03:12.043 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:03:12.043 Run-time dependency json-c found: YES 0.17 00:03:12.043 Run-time dependency cmocka found: YES 1.1.7 00:03:12.043 Program pytest-3 found: NO 00:03:12.043 Program flake8 found: NO 00:03:12.043 Program misspell-fixer found: NO 00:03:12.043 Program restructuredtext-lint found: NO 00:03:12.043 Program valgrind found: YES (/usr/bin/valgrind) 00:03:12.043 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:03:12.043 Compiler for C supports arguments -Wmissing-declarations: YES 00:03:12.043 Compiler for C supports arguments -Wwrite-strings: YES 00:03:12.043 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:03:12.043 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:03:12.043 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:03:12.043 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
00:03:12.043 Build targets in project: 8 00:03:12.043 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:03:12.043 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:03:12.043 00:03:12.043 libvfio-user 0.0.1 00:03:12.043 00:03:12.043 User defined options 00:03:12.043 buildtype : debug 00:03:12.043 default_library: static 00:03:12.043 libdir : /usr/local/lib 00:03:12.043 00:03:12.043 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:12.303 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:12.303 [1/36] Compiling C object samples/lspci.p/lspci.c.o 00:03:12.303 [2/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:03:12.303 [3/36] Compiling C object samples/null.p/null.c.o 00:03:12.303 [4/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:03:12.303 [5/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:03:12.303 [6/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:03:12.303 [7/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:03:12.303 [8/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:03:12.303 [9/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:03:12.303 [10/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:03:12.303 [11/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:03:12.303 [12/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:03:12.303 [13/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:03:12.303 [14/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:03:12.303 [15/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:03:12.303 [16/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:03:12.303 [17/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:03:12.303 [18/36] Compiling C object test/unit_tests.p/mocks.c.o 00:03:12.303 [19/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:03:12.303 [20/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:03:12.303 [21/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:03:12.303 [22/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:03:12.303 [23/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:03:12.303 [24/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:03:12.303 [25/36] Compiling C object samples/client.p/client.c.o 00:03:12.303 [26/36] Compiling C object samples/server.p/server.c.o 00:03:12.303 [27/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:03:12.562 [28/36] Linking target samples/client 00:03:12.562 [29/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:03:12.562 [30/36] Linking target test/unit_tests 00:03:12.562 [31/36] Linking static target lib/libvfio-user.a 00:03:12.562 [32/36] Linking target samples/shadow_ioeventfd_server 00:03:12.562 [33/36] Linking target samples/lspci 00:03:12.562 [34/36] Linking target samples/gpio-pci-idio-16 00:03:12.562 [35/36] Linking target samples/server 00:03:12.562 [36/36] Linking target samples/null 00:03:12.562 INFO: autodetecting backend as ninja 00:03:12.562 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:12.562 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:13.130 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:13.130 ninja: no work to do. 00:03:28.012 CC lib/ut_mock/mock.o 00:03:28.012 CC lib/ut/ut.o 00:03:28.012 CC lib/log/log.o 00:03:28.012 CC lib/log/log_flags.o 00:03:28.012 CC lib/log/log_deprecated.o 00:03:28.012 LIB libspdk_ut_mock.a 00:03:28.012 LIB libspdk_ut.a 00:03:28.012 LIB libspdk_log.a 00:03:28.012 CC lib/dma/dma.o 00:03:28.012 CXX lib/trace_parser/trace.o 00:03:28.012 CC lib/ioat/ioat.o 00:03:28.012 CC lib/util/cpuset.o 00:03:28.012 CC lib/util/base64.o 00:03:28.012 CC lib/util/bit_array.o 00:03:28.012 CC lib/util/crc16.o 00:03:28.012 CC lib/util/crc32.o 00:03:28.012 CC lib/util/crc32c.o 00:03:28.012 CC lib/util/dif.o 00:03:28.012 CC lib/util/crc32_ieee.o 00:03:28.012 CC lib/util/crc64.o 00:03:28.012 CC lib/util/fd.o 00:03:28.012 CC lib/util/fd_group.o 00:03:28.012 CC lib/util/file.o 00:03:28.012 CC lib/util/hexlify.o 00:03:28.012 CC lib/util/iov.o 00:03:28.012 CC lib/util/pipe.o 00:03:28.012 CC lib/util/math.o 00:03:28.012 CC lib/util/net.o 00:03:28.012 CC lib/util/strerror_tls.o 00:03:28.012 CC lib/util/string.o 00:03:28.012 CC lib/util/uuid.o 00:03:28.012 CC lib/util/xor.o 00:03:28.012 CC lib/util/zipf.o 00:03:28.012 CC lib/util/md5.o 00:03:28.012 CC lib/vfio_user/host/vfio_user_pci.o 00:03:28.012 CC lib/vfio_user/host/vfio_user.o 00:03:28.012 LIB libspdk_dma.a 00:03:28.012 LIB libspdk_ioat.a 00:03:28.012 LIB libspdk_vfio_user.a 00:03:28.012 LIB libspdk_util.a 00:03:28.012 LIB libspdk_trace_parser.a 00:03:28.012 CC lib/rdma_utils/rdma_utils.o 00:03:28.012 CC lib/idxd/idxd.o 00:03:28.012 CC lib/env_dpdk/env.o 00:03:28.012 CC lib/env_dpdk/memory.o 00:03:28.012 CC lib/idxd/idxd_user.o 00:03:28.012 CC lib/idxd/idxd_kernel.o 00:03:28.012 CC lib/env_dpdk/init.o 00:03:28.012 CC lib/env_dpdk/pci.o 00:03:28.012 CC lib/env_dpdk/threads.o 00:03:28.012 CC lib/env_dpdk/pci_ioat.o 00:03:28.012 CC lib/env_dpdk/pci_virtio.o 00:03:28.012 CC lib/env_dpdk/pci_vmd.o 00:03:28.012 CC lib/env_dpdk/pci_event.o 00:03:28.012 CC lib/env_dpdk/pci_idxd.o 00:03:28.012 CC lib/env_dpdk/sigbus_handler.o 00:03:28.012 CC lib/env_dpdk/pci_dpdk.o 00:03:28.012 CC lib/json/json_parse.o 00:03:28.013 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:28.013 CC lib/json/json_util.o 00:03:28.013 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:28.013 CC lib/json/json_write.o 00:03:28.013 CC lib/conf/conf.o 00:03:28.013 CC lib/vmd/vmd.o 00:03:28.013 CC lib/vmd/led.o 00:03:28.013 LIB libspdk_conf.a 00:03:28.013 LIB libspdk_rdma_utils.a 00:03:28.013 LIB libspdk_json.a 00:03:28.013 LIB libspdk_idxd.a 00:03:28.013 LIB libspdk_vmd.a 00:03:28.013 CC lib/rdma_provider/common.o 00:03:28.013 CC lib/rdma_provider/rdma_provider_verbs.o 00:03:28.013 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:28.013 CC lib/jsonrpc/jsonrpc_client.o 00:03:28.013 CC lib/jsonrpc/jsonrpc_server.o 00:03:28.013 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:28.013 LIB libspdk_rdma_provider.a 00:03:28.013 LIB libspdk_jsonrpc.a 00:03:28.271 CC lib/rpc/rpc.o 00:03:28.271 LIB libspdk_env_dpdk.a 00:03:28.271 LIB libspdk_rpc.a 00:03:28.530 CC lib/notify/notify.o 00:03:28.530 CC lib/notify/notify_rpc.o 00:03:28.530 CC lib/keyring/keyring.o 00:03:28.530 CC lib/keyring/keyring_rpc.o 00:03:28.790 CC lib/trace/trace_flags.o 00:03:28.790 CC lib/trace/trace_rpc.o 00:03:28.790 CC lib/trace/trace.o 00:03:28.790 LIB libspdk_notify.a 00:03:28.790 LIB libspdk_keyring.a 00:03:28.790 LIB 
libspdk_trace.a 00:03:29.049 CC lib/sock/sock.o 00:03:29.049 CC lib/sock/sock_rpc.o 00:03:29.049 CC lib/thread/thread.o 00:03:29.049 CC lib/thread/iobuf.o 00:03:29.308 LIB libspdk_sock.a 00:03:29.566 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:29.566 CC lib/nvme/nvme_fabric.o 00:03:29.566 CC lib/nvme/nvme_ctrlr.o 00:03:29.566 CC lib/nvme/nvme_ns_cmd.o 00:03:29.566 CC lib/nvme/nvme_pcie.o 00:03:29.566 CC lib/nvme/nvme_ns.o 00:03:29.566 CC lib/nvme/nvme_pcie_common.o 00:03:29.566 CC lib/nvme/nvme_qpair.o 00:03:29.566 CC lib/nvme/nvme.o 00:03:29.566 CC lib/nvme/nvme_discovery.o 00:03:29.566 CC lib/nvme/nvme_quirks.o 00:03:29.566 CC lib/nvme/nvme_transport.o 00:03:29.566 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:29.566 CC lib/nvme/nvme_opal.o 00:03:29.566 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:29.566 CC lib/nvme/nvme_tcp.o 00:03:29.566 CC lib/nvme/nvme_poll_group.o 00:03:29.566 CC lib/nvme/nvme_io_msg.o 00:03:29.566 CC lib/nvme/nvme_stubs.o 00:03:29.566 CC lib/nvme/nvme_zns.o 00:03:29.566 CC lib/nvme/nvme_auth.o 00:03:29.566 CC lib/nvme/nvme_vfio_user.o 00:03:29.566 CC lib/nvme/nvme_cuse.o 00:03:29.566 CC lib/nvme/nvme_rdma.o 00:03:29.824 LIB libspdk_thread.a 00:03:30.082 CC lib/blob/zeroes.o 00:03:30.082 CC lib/blob/blobstore.o 00:03:30.082 CC lib/blob/request.o 00:03:30.082 CC lib/blob/blob_bs_dev.o 00:03:30.341 CC lib/virtio/virtio.o 00:03:30.341 CC lib/virtio/virtio_vhost_user.o 00:03:30.341 CC lib/virtio/virtio_pci.o 00:03:30.341 CC lib/accel/accel_rpc.o 00:03:30.341 CC lib/virtio/virtio_vfio_user.o 00:03:30.341 CC lib/accel/accel.o 00:03:30.341 CC lib/accel/accel_sw.o 00:03:30.341 CC lib/init/subsystem.o 00:03:30.341 CC lib/init/subsystem_rpc.o 00:03:30.341 CC lib/init/rpc.o 00:03:30.341 CC lib/init/json_config.o 00:03:30.341 CC lib/vfu_tgt/tgt_endpoint.o 00:03:30.341 CC lib/vfu_tgt/tgt_rpc.o 00:03:30.341 CC lib/fsdev/fsdev.o 00:03:30.341 CC lib/fsdev/fsdev_io.o 00:03:30.341 CC lib/fsdev/fsdev_rpc.o 00:03:30.341 LIB libspdk_init.a 00:03:30.341 LIB libspdk_virtio.a 00:03:30.599 LIB libspdk_vfu_tgt.a 00:03:30.599 LIB libspdk_fsdev.a 00:03:30.599 CC lib/event/app.o 00:03:30.599 CC lib/event/reactor.o 00:03:30.599 CC lib/event/app_rpc.o 00:03:30.599 CC lib/event/log_rpc.o 00:03:30.599 CC lib/event/scheduler_static.o 00:03:30.857 LIB libspdk_event.a 00:03:30.857 CC lib/fuse_dispatcher/fuse_dispatcher.o 00:03:31.115 LIB libspdk_accel.a 00:03:31.115 LIB libspdk_nvme.a 00:03:31.374 LIB libspdk_fuse_dispatcher.a 00:03:31.374 CC lib/bdev/bdev.o 00:03:31.374 CC lib/bdev/part.o 00:03:31.374 CC lib/bdev/bdev_rpc.o 00:03:31.374 CC lib/bdev/bdev_zone.o 00:03:31.374 CC lib/bdev/scsi_nvme.o 00:03:31.941 LIB libspdk_blob.a 00:03:32.199 CC lib/lvol/lvol.o 00:03:32.199 CC lib/blobfs/blobfs.o 00:03:32.199 CC lib/blobfs/tree.o 00:03:32.766 LIB libspdk_lvol.a 00:03:32.766 LIB libspdk_blobfs.a 00:03:33.024 LIB libspdk_bdev.a 00:03:33.293 CC lib/ublk/ublk.o 00:03:33.293 CC lib/ublk/ublk_rpc.o 00:03:33.293 CC lib/nvmf/ctrlr_discovery.o 00:03:33.293 CC lib/nvmf/ctrlr.o 00:03:33.293 CC lib/ftl/ftl_core.o 00:03:33.293 CC lib/nvmf/ctrlr_bdev.o 00:03:33.293 CC lib/nvmf/subsystem.o 00:03:33.293 CC lib/ftl/ftl_init.o 00:03:33.293 CC lib/nvmf/nvmf.o 00:03:33.293 CC lib/ftl/ftl_layout.o 00:03:33.293 CC lib/nvmf/nvmf_rpc.o 00:03:33.293 CC lib/ftl/ftl_debug.o 00:03:33.293 CC lib/nvmf/transport.o 00:03:33.293 CC lib/nvmf/mdns_server.o 00:03:33.293 CC lib/ftl/ftl_sb.o 00:03:33.293 CC lib/nvmf/tcp.o 00:03:33.293 CC lib/ftl/ftl_io.o 00:03:33.293 CC lib/nvmf/rdma.o 00:03:33.293 CC lib/nvmf/stubs.o 00:03:33.293 CC lib/nvmf/vfio_user.o 
00:03:33.293 CC lib/ftl/ftl_l2p.o 00:03:33.293 CC lib/ftl/ftl_l2p_flat.o 00:03:33.293 CC lib/nvmf/auth.o 00:03:33.293 CC lib/ftl/ftl_band.o 00:03:33.293 CC lib/ftl/ftl_nv_cache.o 00:03:33.293 CC lib/ftl/ftl_band_ops.o 00:03:33.293 CC lib/ftl/ftl_writer.o 00:03:33.293 CC lib/ftl/ftl_rq.o 00:03:33.293 CC lib/ftl/ftl_reloc.o 00:03:33.293 CC lib/ftl/ftl_l2p_cache.o 00:03:33.293 CC lib/ftl/ftl_p2l.o 00:03:33.293 CC lib/ftl/ftl_p2l_log.o 00:03:33.293 CC lib/ftl/mngt/ftl_mngt.o 00:03:33.293 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:33.293 CC lib/nbd/nbd.o 00:03:33.293 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:33.293 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:33.293 CC lib/nbd/nbd_rpc.o 00:03:33.293 CC lib/scsi/dev.o 00:03:33.293 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:33.293 CC lib/scsi/lun.o 00:03:33.293 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:33.293 CC lib/scsi/port.o 00:03:33.293 CC lib/scsi/scsi.o 00:03:33.293 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:33.293 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:33.293 CC lib/scsi/scsi_pr.o 00:03:33.293 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:33.293 CC lib/scsi/scsi_bdev.o 00:03:33.293 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:33.293 CC lib/scsi/scsi_rpc.o 00:03:33.293 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:33.293 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:33.293 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:33.293 CC lib/scsi/task.o 00:03:33.293 CC lib/ftl/utils/ftl_conf.o 00:03:33.293 CC lib/ftl/utils/ftl_md.o 00:03:33.293 CC lib/ftl/utils/ftl_mempool.o 00:03:33.293 CC lib/ftl/utils/ftl_bitmap.o 00:03:33.293 CC lib/ftl/utils/ftl_property.o 00:03:33.293 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:33.293 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:33.293 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:33.293 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:33.293 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:33.293 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:33.293 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:03:33.293 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:33.293 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:33.293 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:33.293 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:33.293 CC lib/ftl/nvc/ftl_nvc_bdev_non_vss.o 00:03:33.552 CC lib/ftl/nvc/ftl_nvc_bdev_common.o 00:03:33.552 CC lib/ftl/base/ftl_base_dev.o 00:03:33.552 CC lib/ftl/base/ftl_base_bdev.o 00:03:33.552 CC lib/ftl/ftl_trace.o 00:03:33.810 LIB libspdk_nbd.a 00:03:33.810 LIB libspdk_scsi.a 00:03:34.142 LIB libspdk_ublk.a 00:03:34.142 LIB libspdk_ftl.a 00:03:34.142 CC lib/iscsi/conn.o 00:03:34.142 CC lib/iscsi/init_grp.o 00:03:34.142 CC lib/iscsi/iscsi.o 00:03:34.142 CC lib/iscsi/tgt_node.o 00:03:34.143 CC lib/iscsi/param.o 00:03:34.143 CC lib/iscsi/portal_grp.o 00:03:34.143 CC lib/iscsi/iscsi_subsystem.o 00:03:34.143 CC lib/iscsi/iscsi_rpc.o 00:03:34.143 CC lib/iscsi/task.o 00:03:34.143 CC lib/vhost/vhost.o 00:03:34.143 CC lib/vhost/vhost_rpc.o 00:03:34.143 CC lib/vhost/vhost_scsi.o 00:03:34.143 CC lib/vhost/vhost_blk.o 00:03:34.143 CC lib/vhost/rte_vhost_user.o 00:03:34.855 LIB libspdk_nvmf.a 00:03:34.855 LIB libspdk_vhost.a 00:03:35.168 LIB libspdk_iscsi.a 00:03:35.440 CC module/vfu_device/vfu_virtio_blk.o 00:03:35.440 CC module/vfu_device/vfu_virtio.o 00:03:35.440 CC module/vfu_device/vfu_virtio_rpc.o 00:03:35.440 CC module/vfu_device/vfu_virtio_scsi.o 00:03:35.440 CC module/vfu_device/vfu_virtio_fs.o 00:03:35.440 CC module/env_dpdk/env_dpdk_rpc.o 00:03:35.440 CC module/blob/bdev/blob_bdev.o 00:03:35.440 CC module/accel/dsa/accel_dsa_rpc.o 00:03:35.440 CC module/accel/dsa/accel_dsa.o 00:03:35.440 CC 
module/accel/error/accel_error_rpc.o 00:03:35.440 CC module/accel/error/accel_error.o 00:03:35.440 CC module/sock/posix/posix.o 00:03:35.440 CC module/fsdev/aio/fsdev_aio.o 00:03:35.440 CC module/fsdev/aio/fsdev_aio_rpc.o 00:03:35.440 CC module/fsdev/aio/linux_aio_mgr.o 00:03:35.440 CC module/accel/ioat/accel_ioat.o 00:03:35.440 CC module/accel/ioat/accel_ioat_rpc.o 00:03:35.440 CC module/keyring/linux/keyring.o 00:03:35.440 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:35.699 CC module/keyring/linux/keyring_rpc.o 00:03:35.699 LIB libspdk_env_dpdk_rpc.a 00:03:35.699 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:35.699 CC module/scheduler/gscheduler/gscheduler.o 00:03:35.699 CC module/keyring/file/keyring_rpc.o 00:03:35.699 CC module/keyring/file/keyring.o 00:03:35.699 CC module/accel/iaa/accel_iaa.o 00:03:35.699 CC module/accel/iaa/accel_iaa_rpc.o 00:03:35.699 LIB libspdk_keyring_linux.a 00:03:35.699 LIB libspdk_scheduler_dpdk_governor.a 00:03:35.699 LIB libspdk_scheduler_gscheduler.a 00:03:35.699 LIB libspdk_keyring_file.a 00:03:35.699 LIB libspdk_accel_error.a 00:03:35.699 LIB libspdk_accel_ioat.a 00:03:35.699 LIB libspdk_scheduler_dynamic.a 00:03:35.699 LIB libspdk_blob_bdev.a 00:03:35.699 LIB libspdk_accel_iaa.a 00:03:35.699 LIB libspdk_accel_dsa.a 00:03:35.958 LIB libspdk_vfu_device.a 00:03:35.958 LIB libspdk_sock_posix.a 00:03:35.958 LIB libspdk_fsdev_aio.a 00:03:36.218 CC module/bdev/gpt/gpt.o 00:03:36.218 CC module/bdev/gpt/vbdev_gpt.o 00:03:36.218 CC module/bdev/lvol/vbdev_lvol.o 00:03:36.218 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:36.218 CC module/bdev/malloc/bdev_malloc.o 00:03:36.218 CC module/bdev/ftl/bdev_ftl.o 00:03:36.218 CC module/bdev/passthru/vbdev_passthru.o 00:03:36.218 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:36.218 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:36.218 CC module/bdev/aio/bdev_aio.o 00:03:36.218 CC module/bdev/aio/bdev_aio_rpc.o 00:03:36.218 CC module/bdev/delay/vbdev_delay.o 00:03:36.218 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:36.218 CC module/bdev/error/vbdev_error.o 00:03:36.218 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:36.218 CC module/bdev/error/vbdev_error_rpc.o 00:03:36.218 CC module/bdev/nvme/bdev_nvme.o 00:03:36.218 CC module/bdev/raid/bdev_raid.o 00:03:36.218 CC module/bdev/raid/bdev_raid_rpc.o 00:03:36.218 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:36.218 CC module/bdev/raid/bdev_raid_sb.o 00:03:36.218 CC module/bdev/nvme/vbdev_opal.o 00:03:36.218 CC module/bdev/raid/raid0.o 00:03:36.218 CC module/bdev/raid/concat.o 00:03:36.218 CC module/bdev/nvme/bdev_mdns_client.o 00:03:36.218 CC module/bdev/nvme/nvme_rpc.o 00:03:36.218 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:36.218 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:36.218 CC module/bdev/raid/raid1.o 00:03:36.218 CC module/blobfs/bdev/blobfs_bdev.o 00:03:36.218 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:36.218 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:36.218 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:36.218 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:36.218 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:36.218 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:36.218 CC module/bdev/null/bdev_null_rpc.o 00:03:36.218 CC module/bdev/null/bdev_null.o 00:03:36.218 CC module/bdev/iscsi/bdev_iscsi.o 00:03:36.218 CC module/bdev/split/vbdev_split_rpc.o 00:03:36.218 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:36.218 CC module/bdev/split/vbdev_split.o 00:03:36.218 LIB libspdk_bdev_gpt.a 00:03:36.477 LIB libspdk_bdev_error.a 00:03:36.477 LIB 
libspdk_bdev_ftl.a 00:03:36.477 LIB libspdk_blobfs_bdev.a 00:03:36.477 LIB libspdk_bdev_aio.a 00:03:36.477 LIB libspdk_bdev_split.a 00:03:36.477 LIB libspdk_bdev_delay.a 00:03:36.477 LIB libspdk_bdev_zone_block.a 00:03:36.477 LIB libspdk_bdev_malloc.a 00:03:36.477 LIB libspdk_bdev_iscsi.a 00:03:36.477 LIB libspdk_bdev_passthru.a 00:03:36.477 LIB libspdk_bdev_null.a 00:03:36.477 LIB libspdk_bdev_lvol.a 00:03:36.737 LIB libspdk_bdev_virtio.a 00:03:36.737 LIB libspdk_bdev_raid.a 00:03:37.676 LIB libspdk_bdev_nvme.a 00:03:38.245 CC module/event/subsystems/sock/sock.o 00:03:38.245 CC module/event/subsystems/fsdev/fsdev.o 00:03:38.245 CC module/event/subsystems/scheduler/scheduler.o 00:03:38.245 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:38.245 CC module/event/subsystems/keyring/keyring.o 00:03:38.245 CC module/event/subsystems/vmd/vmd.o 00:03:38.245 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:38.245 CC module/event/subsystems/iobuf/iobuf.o 00:03:38.245 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:38.245 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:38.505 LIB libspdk_event_fsdev.a 00:03:38.505 LIB libspdk_event_sock.a 00:03:38.505 LIB libspdk_event_keyring.a 00:03:38.505 LIB libspdk_event_vfu_tgt.a 00:03:38.505 LIB libspdk_event_vmd.a 00:03:38.505 LIB libspdk_event_scheduler.a 00:03:38.505 LIB libspdk_event_iobuf.a 00:03:38.505 LIB libspdk_event_vhost_blk.a 00:03:38.764 CC module/event/subsystems/accel/accel.o 00:03:38.764 LIB libspdk_event_accel.a 00:03:39.023 CC module/event/subsystems/bdev/bdev.o 00:03:39.283 LIB libspdk_event_bdev.a 00:03:39.543 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:39.543 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:39.543 CC module/event/subsystems/ublk/ublk.o 00:03:39.543 CC module/event/subsystems/nbd/nbd.o 00:03:39.543 CC module/event/subsystems/scsi/scsi.o 00:03:39.543 LIB libspdk_event_ublk.a 00:03:39.543 LIB libspdk_event_nbd.a 00:03:39.803 LIB libspdk_event_scsi.a 00:03:39.803 LIB libspdk_event_nvmf.a 00:03:40.062 CC module/event/subsystems/iscsi/iscsi.o 00:03:40.062 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:40.062 LIB libspdk_event_iscsi.a 00:03:40.062 LIB libspdk_event_vhost_scsi.a 00:03:40.321 CC app/spdk_nvme_perf/perf.o 00:03:40.321 CXX app/trace/trace.o 00:03:40.321 TEST_HEADER include/spdk/accel.h 00:03:40.321 TEST_HEADER include/spdk/barrier.h 00:03:40.321 TEST_HEADER include/spdk/accel_module.h 00:03:40.321 TEST_HEADER include/spdk/assert.h 00:03:40.321 TEST_HEADER include/spdk/bdev.h 00:03:40.321 TEST_HEADER include/spdk/bdev_module.h 00:03:40.321 CC app/trace_record/trace_record.o 00:03:40.321 TEST_HEADER include/spdk/base64.h 00:03:40.321 TEST_HEADER include/spdk/bit_array.h 00:03:40.321 TEST_HEADER include/spdk/bdev_zone.h 00:03:40.321 TEST_HEADER include/spdk/bit_pool.h 00:03:40.321 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:40.321 TEST_HEADER include/spdk/blobfs.h 00:03:40.321 CC app/spdk_nvme_identify/identify.o 00:03:40.321 TEST_HEADER include/spdk/blob_bdev.h 00:03:40.321 TEST_HEADER include/spdk/config.h 00:03:40.321 TEST_HEADER include/spdk/crc16.h 00:03:40.321 TEST_HEADER include/spdk/cpuset.h 00:03:40.321 TEST_HEADER include/spdk/blob.h 00:03:40.321 TEST_HEADER include/spdk/conf.h 00:03:40.321 TEST_HEADER include/spdk/crc32.h 00:03:40.321 TEST_HEADER include/spdk/crc64.h 00:03:40.321 TEST_HEADER include/spdk/dif.h 00:03:40.321 TEST_HEADER include/spdk/dma.h 00:03:40.321 CC test/rpc_client/rpc_client_test.o 00:03:40.321 TEST_HEADER include/spdk/endian.h 00:03:40.321 TEST_HEADER 
include/spdk/event.h 00:03:40.321 TEST_HEADER include/spdk/env.h 00:03:40.321 TEST_HEADER include/spdk/fd_group.h 00:03:40.321 TEST_HEADER include/spdk/fd.h 00:03:40.321 TEST_HEADER include/spdk/env_dpdk.h 00:03:40.321 TEST_HEADER include/spdk/file.h 00:03:40.321 TEST_HEADER include/spdk/fsdev.h 00:03:40.321 CC app/spdk_nvme_discover/discovery_aer.o 00:03:40.321 TEST_HEADER include/spdk/fsdev_module.h 00:03:40.321 TEST_HEADER include/spdk/fuse_dispatcher.h 00:03:40.321 TEST_HEADER include/spdk/gpt_spec.h 00:03:40.321 TEST_HEADER include/spdk/hexlify.h 00:03:40.321 TEST_HEADER include/spdk/ftl.h 00:03:40.321 CC app/spdk_lspci/spdk_lspci.o 00:03:40.321 TEST_HEADER include/spdk/idxd_spec.h 00:03:40.321 TEST_HEADER include/spdk/idxd.h 00:03:40.321 TEST_HEADER include/spdk/histogram_data.h 00:03:40.321 TEST_HEADER include/spdk/init.h 00:03:40.321 TEST_HEADER include/spdk/ioat_spec.h 00:03:40.321 TEST_HEADER include/spdk/ioat.h 00:03:40.321 TEST_HEADER include/spdk/iscsi_spec.h 00:03:40.321 TEST_HEADER include/spdk/json.h 00:03:40.321 CC app/spdk_top/spdk_top.o 00:03:40.321 TEST_HEADER include/spdk/jsonrpc.h 00:03:40.321 TEST_HEADER include/spdk/keyring.h 00:03:40.321 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:40.321 TEST_HEADER include/spdk/keyring_module.h 00:03:40.321 TEST_HEADER include/spdk/likely.h 00:03:40.321 TEST_HEADER include/spdk/lvol.h 00:03:40.321 TEST_HEADER include/spdk/log.h 00:03:40.321 TEST_HEADER include/spdk/md5.h 00:03:40.321 TEST_HEADER include/spdk/memory.h 00:03:40.321 TEST_HEADER include/spdk/mmio.h 00:03:40.321 TEST_HEADER include/spdk/nbd.h 00:03:40.321 TEST_HEADER include/spdk/notify.h 00:03:40.321 TEST_HEADER include/spdk/net.h 00:03:40.321 TEST_HEADER include/spdk/nvme.h 00:03:40.321 TEST_HEADER include/spdk/nvme_intel.h 00:03:40.321 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:40.321 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:40.321 TEST_HEADER include/spdk/nvme_spec.h 00:03:40.321 TEST_HEADER include/spdk/nvme_zns.h 00:03:40.321 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:40.321 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:40.321 TEST_HEADER include/spdk/nvmf.h 00:03:40.321 TEST_HEADER include/spdk/nvmf_spec.h 00:03:40.321 TEST_HEADER include/spdk/nvmf_transport.h 00:03:40.321 TEST_HEADER include/spdk/opal.h 00:03:40.321 TEST_HEADER include/spdk/opal_spec.h 00:03:40.321 TEST_HEADER include/spdk/pci_ids.h 00:03:40.321 TEST_HEADER include/spdk/pipe.h 00:03:40.321 TEST_HEADER include/spdk/queue.h 00:03:40.321 TEST_HEADER include/spdk/reduce.h 00:03:40.321 TEST_HEADER include/spdk/scheduler.h 00:03:40.321 TEST_HEADER include/spdk/rpc.h 00:03:40.321 TEST_HEADER include/spdk/scsi.h 00:03:40.321 TEST_HEADER include/spdk/scsi_spec.h 00:03:40.321 TEST_HEADER include/spdk/sock.h 00:03:40.321 TEST_HEADER include/spdk/stdinc.h 00:03:40.321 TEST_HEADER include/spdk/string.h 00:03:40.321 TEST_HEADER include/spdk/thread.h 00:03:40.321 CC app/iscsi_tgt/iscsi_tgt.o 00:03:40.321 TEST_HEADER include/spdk/trace.h 00:03:40.321 TEST_HEADER include/spdk/trace_parser.h 00:03:40.321 TEST_HEADER include/spdk/tree.h 00:03:40.321 TEST_HEADER include/spdk/ublk.h 00:03:40.321 TEST_HEADER include/spdk/util.h 00:03:40.321 TEST_HEADER include/spdk/uuid.h 00:03:40.585 TEST_HEADER include/spdk/version.h 00:03:40.585 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:40.585 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:40.585 TEST_HEADER include/spdk/vhost.h 00:03:40.585 TEST_HEADER include/spdk/vmd.h 00:03:40.585 TEST_HEADER include/spdk/xor.h 00:03:40.585 TEST_HEADER include/spdk/zipf.h 
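The TEST_HEADER entries above enumerate SPDK's public headers, and the CXX test/cpp_headers objects that follow compile one translation unit per header; the point of the exercise is that a header which forgets one of its own includes fails to build in isolation. A condensed bash sketch of that self-containedness check, assuming the clang++-17 toolchain from this run; the loop, file names, and include path are illustrative, only the header names and compiler come from the log:

# For each public header, emit a tiny TU that includes nothing but
# that header, then compile it; a failure flags a header that is not
# self-contained.
for hdr in accel.h assert.h barrier.h base64.h; do   # abridged from the list above
  tu="test_${hdr%.h}.cpp"
  printf '#include <spdk/%s>\nint main() { return 0; }\n' "$hdr" > "$tu"
  clang++-17 -Iinclude -c "$tu" -o /dev/null \
    || echo "spdk/$hdr is not self-contained" >&2
done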
00:03:40.585 CXX test/cpp_headers/accel.o 00:03:40.585 CXX test/cpp_headers/accel_module.o 00:03:40.585 CXX test/cpp_headers/assert.o 00:03:40.585 CXX test/cpp_headers/barrier.o 00:03:40.585 CXX test/cpp_headers/base64.o 00:03:40.585 CXX test/cpp_headers/bdev.o 00:03:40.585 CXX test/cpp_headers/bdev_module.o 00:03:40.585 CXX test/cpp_headers/bdev_zone.o 00:03:40.586 CXX test/cpp_headers/bit_pool.o 00:03:40.586 CXX test/cpp_headers/bit_array.o 00:03:40.586 CXX test/cpp_headers/blob_bdev.o 00:03:40.586 CXX test/cpp_headers/blobfs_bdev.o 00:03:40.586 CXX test/cpp_headers/blobfs.o 00:03:40.586 CC app/spdk_dd/spdk_dd.o 00:03:40.586 CXX test/cpp_headers/blob.o 00:03:40.586 CXX test/cpp_headers/conf.o 00:03:40.586 CXX test/cpp_headers/config.o 00:03:40.586 CXX test/cpp_headers/crc32.o 00:03:40.586 CXX test/cpp_headers/crc16.o 00:03:40.586 CXX test/cpp_headers/cpuset.o 00:03:40.586 CXX test/cpp_headers/crc64.o 00:03:40.586 CXX test/cpp_headers/dif.o 00:03:40.586 CXX test/cpp_headers/dma.o 00:03:40.586 CXX test/cpp_headers/endian.o 00:03:40.586 CXX test/cpp_headers/env.o 00:03:40.586 CXX test/cpp_headers/env_dpdk.o 00:03:40.586 CXX test/cpp_headers/event.o 00:03:40.586 CXX test/cpp_headers/fd_group.o 00:03:40.586 CXX test/cpp_headers/fd.o 00:03:40.586 CXX test/cpp_headers/file.o 00:03:40.586 CXX test/cpp_headers/fsdev_module.o 00:03:40.586 CXX test/cpp_headers/fsdev.o 00:03:40.586 CXX test/cpp_headers/ftl.o 00:03:40.586 CXX test/cpp_headers/fuse_dispatcher.o 00:03:40.586 CXX test/cpp_headers/gpt_spec.o 00:03:40.586 CXX test/cpp_headers/hexlify.o 00:03:40.586 CXX test/cpp_headers/histogram_data.o 00:03:40.586 CXX test/cpp_headers/idxd.o 00:03:40.586 CXX test/cpp_headers/idxd_spec.o 00:03:40.586 CXX test/cpp_headers/init.o 00:03:40.586 CXX test/cpp_headers/ioat.o 00:03:40.586 CXX test/cpp_headers/ioat_spec.o 00:03:40.586 CC examples/util/zipf/zipf.o 00:03:40.586 CC app/nvmf_tgt/nvmf_main.o 00:03:40.586 CC app/spdk_tgt/spdk_tgt.o 00:03:40.586 CC test/app/jsoncat/jsoncat.o 00:03:40.586 CC examples/ioat/perf/perf.o 00:03:40.586 CC examples/ioat/verify/verify.o 00:03:40.586 CC test/thread/lock/spdk_lock.o 00:03:40.586 CC test/app/stub/stub.o 00:03:40.586 CC test/app/histogram_perf/histogram_perf.o 00:03:40.586 CC test/thread/poller_perf/poller_perf.o 00:03:40.586 CC test/env/memory/memory_ut.o 00:03:40.586 CC test/env/vtophys/vtophys.o 00:03:40.586 CC app/fio/nvme/fio_plugin.o 00:03:40.586 CC test/env/pci/pci_ut.o 00:03:40.586 CXX test/cpp_headers/iscsi_spec.o 00:03:40.586 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:40.586 CC test/dma/test_dma/test_dma.o 00:03:40.586 CC test/app/bdev_svc/bdev_svc.o 00:03:40.586 LINK spdk_lspci 00:03:40.586 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:40.586 CC app/fio/bdev/fio_plugin.o 00:03:40.586 CC test/env/mem_callbacks/mem_callbacks.o 00:03:40.586 LINK spdk_nvme_discover 00:03:40.586 LINK rpc_client_test 00:03:40.586 LINK jsoncat 00:03:40.586 LINK spdk_trace_record 00:03:40.586 CXX test/cpp_headers/json.o 00:03:40.586 LINK zipf 00:03:40.586 CXX test/cpp_headers/jsonrpc.o 00:03:40.586 CXX test/cpp_headers/keyring.o 00:03:40.586 CXX test/cpp_headers/keyring_module.o 00:03:40.586 CXX test/cpp_headers/likely.o 00:03:40.586 LINK histogram_perf 00:03:40.586 CXX test/cpp_headers/log.o 00:03:40.586 CXX test/cpp_headers/lvol.o 00:03:40.586 LINK vtophys 00:03:40.586 CXX test/cpp_headers/md5.o 00:03:40.586 CXX test/cpp_headers/memory.o 00:03:40.586 CXX test/cpp_headers/mmio.o 00:03:40.586 CXX test/cpp_headers/nbd.o 00:03:40.586 CXX test/cpp_headers/net.o 
00:03:40.586 LINK poller_perf 00:03:40.586 CXX test/cpp_headers/notify.o 00:03:40.586 CXX test/cpp_headers/nvme.o 00:03:40.586 CXX test/cpp_headers/nvme_intel.o 00:03:40.586 CXX test/cpp_headers/nvme_ocssd.o 00:03:40.586 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:40.586 LINK interrupt_tgt 00:03:40.586 CXX test/cpp_headers/nvme_spec.o 00:03:40.586 CXX test/cpp_headers/nvme_zns.o 00:03:40.586 CXX test/cpp_headers/nvmf_cmd.o 00:03:40.586 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:40.586 CXX test/cpp_headers/nvmf.o 00:03:40.586 CXX test/cpp_headers/nvmf_spec.o 00:03:40.586 CXX test/cpp_headers/nvmf_transport.o 00:03:40.586 CXX test/cpp_headers/opal.o 00:03:40.586 CXX test/cpp_headers/opal_spec.o 00:03:40.586 CXX test/cpp_headers/pci_ids.o 00:03:40.586 CXX test/cpp_headers/pipe.o 00:03:40.586 CXX test/cpp_headers/queue.o 00:03:40.844 CXX test/cpp_headers/reduce.o 00:03:40.844 CXX test/cpp_headers/rpc.o 00:03:40.844 CXX test/cpp_headers/scheduler.o 00:03:40.844 CXX test/cpp_headers/scsi.o 00:03:40.844 CXX test/cpp_headers/scsi_spec.o 00:03:40.844 CXX test/cpp_headers/sock.o 00:03:40.844 CXX test/cpp_headers/stdinc.o 00:03:40.844 CXX test/cpp_headers/string.o 00:03:40.844 CXX test/cpp_headers/thread.o 00:03:40.844 LINK env_dpdk_post_init 00:03:40.844 LINK stub 00:03:40.844 LINK verify 00:03:40.844 LINK nvmf_tgt 00:03:40.844 LINK iscsi_tgt 00:03:40.844 CXX test/cpp_headers/trace.o 00:03:40.844 LINK ioat_perf 00:03:40.844 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:40.844 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:40.844 LINK spdk_tgt 00:03:40.844 LINK bdev_svc 00:03:40.844 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:40.844 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:40.844 LINK spdk_trace 00:03:40.844 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:40.844 CXX test/cpp_headers/trace_parser.o 00:03:40.844 CXX test/cpp_headers/tree.o 00:03:40.844 CXX test/cpp_headers/ublk.o 00:03:40.844 CXX test/cpp_headers/util.o 00:03:40.844 CXX test/cpp_headers/uuid.o 00:03:40.844 CXX test/cpp_headers/version.o 00:03:40.844 CXX test/cpp_headers/vfio_user_pci.o 00:03:40.844 CXX test/cpp_headers/vfio_user_spec.o 00:03:40.844 CXX test/cpp_headers/vhost.o 00:03:40.844 CXX test/cpp_headers/vmd.o 00:03:40.844 CXX test/cpp_headers/xor.o 00:03:40.844 CXX test/cpp_headers/zipf.o 00:03:41.103 LINK spdk_dd 00:03:41.103 LINK pci_ut 00:03:41.103 LINK spdk_nvme 00:03:41.103 LINK nvme_fuzz 00:03:41.103 LINK test_dma 00:03:41.103 LINK spdk_nvme_identify 00:03:41.103 LINK spdk_bdev 00:03:41.103 LINK mem_callbacks 00:03:41.103 LINK llvm_vfio_fuzz 00:03:41.103 LINK spdk_nvme_perf 00:03:41.361 LINK vhost_fuzz 00:03:41.361 LINK spdk_top 00:03:41.361 CC examples/idxd/perf/perf.o 00:03:41.361 CC examples/sock/hello_world/hello_sock.o 00:03:41.361 CC examples/vmd/led/led.o 00:03:41.361 CC examples/vmd/lsvmd/lsvmd.o 00:03:41.361 CC examples/thread/thread/thread_ex.o 00:03:41.361 CC app/vhost/vhost.o 00:03:41.361 LINK llvm_nvme_fuzz 00:03:41.618 LINK led 00:03:41.618 LINK lsvmd 00:03:41.618 LINK memory_ut 00:03:41.618 LINK hello_sock 00:03:41.618 LINK vhost 00:03:41.618 LINK idxd_perf 00:03:41.618 LINK thread 00:03:41.875 LINK spdk_lock 00:03:42.131 LINK iscsi_fuzz 00:03:42.389 CC examples/nvme/arbitration/arbitration.o 00:03:42.389 CC examples/nvme/reconnect/reconnect.o 00:03:42.389 CC examples/nvme/hello_world/hello_world.o 00:03:42.389 CC examples/nvme/hotplug/hotplug.o 00:03:42.389 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:42.389 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:42.389 CC 
examples/nvme/pmr_persistence/pmr_persistence.o 00:03:42.389 CC examples/nvme/abort/abort.o 00:03:42.389 CC test/event/event_perf/event_perf.o 00:03:42.389 CC test/event/reactor/reactor.o 00:03:42.389 CC test/event/reactor_perf/reactor_perf.o 00:03:42.389 CC test/event/app_repeat/app_repeat.o 00:03:42.389 CC test/event/scheduler/scheduler.o 00:03:42.647 LINK pmr_persistence 00:03:42.647 LINK cmb_copy 00:03:42.647 LINK hotplug 00:03:42.647 LINK hello_world 00:03:42.647 LINK event_perf 00:03:42.647 LINK reactor 00:03:42.647 LINK reactor_perf 00:03:42.647 LINK reconnect 00:03:42.647 LINK app_repeat 00:03:42.647 LINK abort 00:03:42.647 LINK arbitration 00:03:42.647 LINK nvme_manage 00:03:42.647 LINK scheduler 00:03:42.905 CC test/nvme/e2edp/nvme_dp.o 00:03:42.905 CC test/nvme/err_injection/err_injection.o 00:03:42.905 CC test/nvme/compliance/nvme_compliance.o 00:03:42.905 CC test/nvme/reserve/reserve.o 00:03:42.905 CC test/nvme/reset/reset.o 00:03:42.905 CC test/nvme/fused_ordering/fused_ordering.o 00:03:42.905 CC test/nvme/sgl/sgl.o 00:03:42.905 CC test/nvme/simple_copy/simple_copy.o 00:03:42.905 CC test/nvme/connect_stress/connect_stress.o 00:03:42.905 CC test/nvme/boot_partition/boot_partition.o 00:03:42.905 CC test/nvme/overhead/overhead.o 00:03:42.905 CC test/nvme/aer/aer.o 00:03:42.905 CC test/nvme/cuse/cuse.o 00:03:42.905 CC test/nvme/startup/startup.o 00:03:42.905 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:42.905 CC test/nvme/fdp/fdp.o 00:03:42.905 CC test/blobfs/mkfs/mkfs.o 00:03:42.905 CC test/accel/dif/dif.o 00:03:42.905 CC test/lvol/esnap/esnap.o 00:03:42.905 LINK boot_partition 00:03:42.905 LINK startup 00:03:42.905 LINK err_injection 00:03:42.905 LINK connect_stress 00:03:42.905 LINK fused_ordering 00:03:42.905 LINK reserve 00:03:42.905 LINK doorbell_aers 00:03:42.905 LINK simple_copy 00:03:42.905 LINK nvme_dp 00:03:42.905 LINK reset 00:03:42.905 LINK aer 00:03:42.905 LINK sgl 00:03:42.905 LINK overhead 00:03:42.905 LINK mkfs 00:03:43.163 LINK fdp 00:03:43.163 LINK nvme_compliance 00:03:43.421 LINK dif 00:03:43.421 CC examples/accel/perf/accel_perf.o 00:03:43.421 CC examples/blob/cli/blobcli.o 00:03:43.421 CC examples/blob/hello_world/hello_blob.o 00:03:43.421 CC examples/fsdev/hello_world/hello_fsdev.o 00:03:43.679 LINK hello_blob 00:03:43.679 LINK hello_fsdev 00:03:43.679 LINK accel_perf 00:03:43.679 LINK cuse 00:03:43.679 LINK blobcli 00:03:44.612 CC examples/bdev/bdevperf/bdevperf.o 00:03:44.612 CC examples/bdev/hello_world/hello_bdev.o 00:03:44.612 LINK hello_bdev 00:03:44.870 CC test/bdev/bdevio/bdevio.o 00:03:44.870 LINK bdevperf 00:03:45.128 LINK bdevio 00:03:46.505 LINK esnap 00:03:46.505 CC examples/nvmf/nvmf/nvmf.o 00:03:46.763 LINK nvmf 00:03:48.140 00:03:48.140 real 0m38.242s 00:03:48.140 user 5m8.255s 00:03:48.140 sys 1m35.342s 00:03:48.140 15:05:26 make -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:03:48.140 15:05:26 make -- common/autotest_common.sh@10 -- $ set +x 00:03:48.140 ************************************ 00:03:48.140 END TEST make 00:03:48.140 ************************************ 00:03:48.140 15:05:26 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:03:48.140 15:05:26 -- pm/common@29 -- $ signal_monitor_resources TERM 00:03:48.140 15:05:26 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:03:48.140 15:05:26 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:48.140 15:05:26 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:03:48.140 
15:05:26 -- pm/common@44 -- $ pid=1334479 00:03:48.140 15:05:26 -- pm/common@50 -- $ kill -TERM 1334479 00:03:48.140 15:05:26 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:48.140 15:05:26 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:03:48.140 15:05:26 -- pm/common@44 -- $ pid=1334481 00:03:48.140 15:05:26 -- pm/common@50 -- $ kill -TERM 1334481 00:03:48.140 15:05:26 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:48.140 15:05:26 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:03:48.140 15:05:26 -- pm/common@44 -- $ pid=1334483 00:03:48.140 15:05:26 -- pm/common@50 -- $ kill -TERM 1334483 00:03:48.140 15:05:26 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:48.140 15:05:26 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:03:48.140 15:05:26 -- pm/common@44 -- $ pid=1334509 00:03:48.140 15:05:26 -- pm/common@50 -- $ sudo -E kill -TERM 1334509 00:03:48.140 15:05:26 -- spdk/autorun.sh@26 -- $ (( SPDK_TEST_UNITTEST == 1 || SPDK_RUN_FUNCTIONAL_TEST == 1 )) 00:03:48.140 15:05:26 -- spdk/autorun.sh@27 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:03:48.140 15:05:26 -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:03:48.140 15:05:26 -- common/autotest_common.sh@1693 -- # lcov --version 00:03:48.140 15:05:26 -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:03:48.399 15:05:26 -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:03:48.399 15:05:26 -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:03:48.399 15:05:26 -- scripts/common.sh@333 -- # local ver1 ver1_l 00:03:48.399 15:05:26 -- scripts/common.sh@334 -- # local ver2 ver2_l 00:03:48.399 15:05:26 -- scripts/common.sh@336 -- # IFS=.-: 00:03:48.400 15:05:26 -- scripts/common.sh@336 -- # read -ra ver1 00:03:48.400 15:05:26 -- scripts/common.sh@337 -- # IFS=.-: 00:03:48.400 15:05:26 -- scripts/common.sh@337 -- # read -ra ver2 00:03:48.400 15:05:26 -- scripts/common.sh@338 -- # local 'op=<' 00:03:48.400 15:05:26 -- scripts/common.sh@340 -- # ver1_l=2 00:03:48.400 15:05:26 -- scripts/common.sh@341 -- # ver2_l=1 00:03:48.400 15:05:26 -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:03:48.400 15:05:26 -- scripts/common.sh@344 -- # case "$op" in 00:03:48.400 15:05:26 -- scripts/common.sh@345 -- # : 1 00:03:48.400 15:05:26 -- scripts/common.sh@364 -- # (( v = 0 )) 00:03:48.400 15:05:26 -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:48.400 15:05:26 -- scripts/common.sh@365 -- # decimal 1 00:03:48.400 15:05:26 -- scripts/common.sh@353 -- # local d=1 00:03:48.400 15:05:26 -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:48.400 15:05:26 -- scripts/common.sh@355 -- # echo 1 00:03:48.400 15:05:26 -- scripts/common.sh@365 -- # ver1[v]=1 00:03:48.400 15:05:26 -- scripts/common.sh@366 -- # decimal 2 00:03:48.400 15:05:26 -- scripts/common.sh@353 -- # local d=2 00:03:48.400 15:05:26 -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:48.400 15:05:26 -- scripts/common.sh@355 -- # echo 2 00:03:48.400 15:05:26 -- scripts/common.sh@366 -- # ver2[v]=2 00:03:48.400 15:05:26 -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:03:48.400 15:05:26 -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:03:48.400 15:05:26 -- scripts/common.sh@368 -- # return 0 00:03:48.400 15:05:26 -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:48.400 15:05:26 -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:03:48.400 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:48.400 --rc genhtml_branch_coverage=1 00:03:48.400 --rc genhtml_function_coverage=1 00:03:48.400 --rc genhtml_legend=1 00:03:48.400 --rc geninfo_all_blocks=1 00:03:48.400 --rc geninfo_unexecuted_blocks=1 00:03:48.400 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:48.400 ' 00:03:48.400 15:05:26 -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:03:48.400 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:48.400 --rc genhtml_branch_coverage=1 00:03:48.400 --rc genhtml_function_coverage=1 00:03:48.400 --rc genhtml_legend=1 00:03:48.400 --rc geninfo_all_blocks=1 00:03:48.400 --rc geninfo_unexecuted_blocks=1 00:03:48.400 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:48.400 ' 00:03:48.400 15:05:26 -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:03:48.400 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:48.400 --rc genhtml_branch_coverage=1 00:03:48.400 --rc genhtml_function_coverage=1 00:03:48.400 --rc genhtml_legend=1 00:03:48.400 --rc geninfo_all_blocks=1 00:03:48.400 --rc geninfo_unexecuted_blocks=1 00:03:48.400 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:48.400 ' 00:03:48.400 15:05:26 -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:03:48.400 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:48.400 --rc genhtml_branch_coverage=1 00:03:48.400 --rc genhtml_function_coverage=1 00:03:48.400 --rc genhtml_legend=1 00:03:48.400 --rc geninfo_all_blocks=1 00:03:48.400 --rc geninfo_unexecuted_blocks=1 00:03:48.400 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:48.400 ' 00:03:48.400 15:05:26 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:48.400 15:05:26 -- nvmf/common.sh@7 -- # uname -s 00:03:48.400 15:05:26 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:48.400 15:05:26 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:48.400 15:05:26 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:48.400 15:05:26 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:48.400 15:05:26 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:48.400 15:05:26 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:48.400 15:05:26 -- nvmf/common.sh@14 -- 
# NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:48.400 15:05:26 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:48.400 15:05:26 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:48.400 15:05:26 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:48.400 15:05:26 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8023d868-666a-e711-906e-0017a4403562 00:03:48.400 15:05:26 -- nvmf/common.sh@18 -- # NVME_HOSTID=8023d868-666a-e711-906e-0017a4403562 00:03:48.400 15:05:26 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:48.400 15:05:26 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:48.400 15:05:26 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:48.400 15:05:26 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:03:48.400 15:05:26 -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:48.400 15:05:26 -- scripts/common.sh@15 -- # shopt -s extglob 00:03:48.400 15:05:26 -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:48.400 15:05:26 -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:48.400 15:05:26 -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:48.400 15:05:26 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:48.400 15:05:26 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:48.400 15:05:26 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:48.400 15:05:26 -- paths/export.sh@5 -- # export PATH 00:03:48.400 15:05:26 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:48.400 15:05:26 -- nvmf/common.sh@51 -- # : 0 00:03:48.400 15:05:26 -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:03:48.400 15:05:26 -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:03:48.400 15:05:26 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:03:48.400 15:05:26 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:48.400 15:05:26 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:48.400 15:05:26 -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:03:48.400 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:03:48.400 15:05:26 -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:03:48.400 15:05:26 -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:03:48.400 15:05:26 -- nvmf/common.sh@55 -- # have_pci_nics=0 00:03:48.400 15:05:26 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:48.400 15:05:26 -- spdk/autotest.sh@32 -- # uname -s 00:03:48.400 
15:05:26 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:48.400 15:05:26 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:48.400 15:05:26 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:48.400 15:05:26 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:48.400 15:05:26 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:48.400 15:05:26 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:48.400 15:05:26 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:48.400 15:05:26 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:48.400 15:05:26 -- spdk/autotest.sh@48 -- # udevadm_pid=1408464 00:03:48.400 15:05:26 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:48.400 15:05:26 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:03:48.400 15:05:26 -- pm/common@17 -- # local monitor 00:03:48.400 15:05:26 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:48.400 15:05:26 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:48.400 15:05:26 -- pm/common@21 -- # date +%s 00:03:48.400 15:05:26 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:48.400 15:05:26 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:48.400 15:05:26 -- pm/common@21 -- # date +%s 00:03:48.400 15:05:26 -- pm/common@25 -- # sleep 1 00:03:48.401 15:05:26 -- pm/common@21 -- # date +%s 00:03:48.401 15:05:26 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732111526 00:03:48.401 15:05:26 -- pm/common@21 -- # date +%s 00:03:48.401 15:05:26 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732111526 00:03:48.401 15:05:26 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732111526 00:03:48.401 15:05:26 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732111526 00:03:48.401 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732111526_collect-cpu-load.pm.log 00:03:48.401 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732111526_collect-vmstat.pm.log 00:03:48.401 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732111526_collect-cpu-temp.pm.log 00:03:48.401 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732111526_collect-bmc-pm.bmc.pm.log 00:03:49.334 15:05:27 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:49.334 15:05:27 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:03:49.334 15:05:27 -- common/autotest_common.sh@726 -- # xtrace_disable 00:03:49.334 15:05:27 -- common/autotest_common.sh@10 -- # set +x 
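The pm/common trace above starts one monitor per resource (collect-cpu-load, collect-vmstat, collect-cpu-temp, collect-bmc-pm), keying each with the same `date +%s` epoch so their logs and pid files correlate, and the shutdown path traced after the make test TERMs each recorded pid. A condensed sketch of that start/stop pattern; the flags, script names, and pid-file locations follow the trace, while the loop bodies are a reconstruction rather than the actual pm/common code:

# Launch the collectors in the background with a shared timestamp and
# record their pids; collect-bmc-pm additionally runs under sudo -E in
# the trace above.
out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power
now=$(date +%s)                 # e.g. 1732111526 in this run
for mon in collect-cpu-load collect-vmstat collect-cpu-temp; do
  "scripts/perf/pm/$mon" -d "$out" -l -p "monitor.autotest.sh.$now" &
  echo $! > "$out/$mon.pid"
done
# ... autotest work happens here ...
for pid_file in "$out"/*.pid; do
  kill -TERM "$(cat "$pid_file")" 2>/dev/null || true
done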
00:03:49.334 15:05:27 -- spdk/autotest.sh@59 -- # create_test_list 00:03:49.334 15:05:27 -- common/autotest_common.sh@752 -- # xtrace_disable 00:03:49.334 15:05:27 -- common/autotest_common.sh@10 -- # set +x 00:03:49.334 15:05:27 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:49.334 15:05:28 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:49.334 15:05:28 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:49.334 15:05:28 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:49.334 15:05:28 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:49.334 15:05:28 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:03:49.334 15:05:28 -- common/autotest_common.sh@1457 -- # uname 00:03:49.334 15:05:28 -- common/autotest_common.sh@1457 -- # '[' Linux = FreeBSD ']' 00:03:49.334 15:05:28 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:03:49.593 15:05:28 -- common/autotest_common.sh@1477 -- # uname 00:03:49.593 15:05:28 -- common/autotest_common.sh@1477 -- # [[ Linux = FreeBSD ]] 00:03:49.593 15:05:28 -- spdk/autotest.sh@68 -- # [[ y == y ]] 00:03:49.593 15:05:28 -- spdk/autotest.sh@70 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:03:49.593 lcov: LCOV version 1.15 00:03:49.593 15:05:28 -- spdk/autotest.sh@72 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:03:56.152 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:04:01.420 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcno 00:04:03.949 15:05:42 -- spdk/autotest.sh@76 -- # timing_enter pre_cleanup 00:04:03.949 15:05:42 -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:03.949 15:05:42 -- common/autotest_common.sh@10 -- # set +x 00:04:03.949 15:05:42 -- spdk/autotest.sh@78 -- # rm -f 00:04:03.949 15:05:42 -- spdk/autotest.sh@81 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:07.231 0000:1a:00.0 (8086 0a54): Already using the nvme driver 00:04:07.231 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:04:07.231 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:04:07.231 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:04:07.231 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:04:07.489 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:04:07.489 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:04:07.489 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:04:07.489 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:04:07.489 
0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:04:07.489 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:04:07.489 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:04:07.489 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:04:07.489 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:04:07.746 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:04:07.746 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:04:07.746 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:04:10.280 15:05:48 -- spdk/autotest.sh@83 -- # get_zoned_devs 00:04:10.280 15:05:48 -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:04:10.280 15:05:48 -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:04:10.280 15:05:48 -- common/autotest_common.sh@1658 -- # local nvme bdf 00:04:10.280 15:05:48 -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:04:10.280 15:05:48 -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:04:10.280 15:05:48 -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:04:10.280 15:05:48 -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:10.280 15:05:48 -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:10.280 15:05:48 -- spdk/autotest.sh@85 -- # (( 0 > 0 )) 00:04:10.280 15:05:48 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:04:10.280 15:05:48 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:04:10.280 15:05:48 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme0n1 00:04:10.280 15:05:48 -- scripts/common.sh@381 -- # local block=/dev/nvme0n1 pt 00:04:10.280 15:05:48 -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:10.280 No valid GPT data, bailing 00:04:10.280 15:05:48 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:10.280 15:05:48 -- scripts/common.sh@394 -- # pt= 00:04:10.280 15:05:48 -- scripts/common.sh@395 -- # return 1 00:04:10.280 15:05:48 -- spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:10.280 1+0 records in 00:04:10.280 1+0 records out 00:04:10.280 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00146754 s, 715 MB/s 00:04:10.280 15:05:48 -- spdk/autotest.sh@105 -- # sync 00:04:10.280 15:05:48 -- spdk/autotest.sh@107 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:10.280 15:05:48 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:10.280 15:05:48 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:15.550 15:05:54 -- spdk/autotest.sh@111 -- # uname -s 00:04:15.550 15:05:54 -- spdk/autotest.sh@111 -- # [[ Linux == Linux ]] 00:04:15.550 15:05:54 -- spdk/autotest.sh@111 -- # [[ 1 -eq 1 ]] 00:04:15.550 15:05:54 -- spdk/autotest.sh@112 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:15.550 15:05:54 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:15.550 15:05:54 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:15.550 15:05:54 -- common/autotest_common.sh@10 -- # set +x 00:04:15.550 ************************************ 00:04:15.550 START TEST setup.sh 00:04:15.550 ************************************ 00:04:15.550 15:05:54 setup.sh -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:15.809 * Looking for test storage... 
00:04:15.809 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:15.809 15:05:54 setup.sh -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:15.809 15:05:54 setup.sh -- common/autotest_common.sh@1693 -- # lcov --version 00:04:15.809 15:05:54 setup.sh -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:15.809 15:05:54 setup.sh -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@336 -- # IFS=.-: 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@336 -- # read -ra ver1 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@337 -- # IFS=.-: 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@337 -- # read -ra ver2 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@338 -- # local 'op=<' 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@340 -- # ver1_l=2 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@341 -- # ver2_l=1 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@344 -- # case "$op" in 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@345 -- # : 1 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@365 -- # decimal 1 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@353 -- # local d=1 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@355 -- # echo 1 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@365 -- # ver1[v]=1 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@366 -- # decimal 2 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@353 -- # local d=2 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@355 -- # echo 2 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@366 -- # ver2[v]=2 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:15.809 15:05:54 setup.sh -- scripts/common.sh@368 -- # return 0 00:04:15.809 15:05:54 setup.sh -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:15.809 15:05:54 setup.sh -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:15.809 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:15.809 --rc genhtml_branch_coverage=1 00:04:15.809 --rc genhtml_function_coverage=1 00:04:15.809 --rc genhtml_legend=1 00:04:15.809 --rc geninfo_all_blocks=1 00:04:15.809 --rc geninfo_unexecuted_blocks=1 00:04:15.809 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:15.809 ' 00:04:15.809 15:05:54 setup.sh -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:15.809 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:15.809 --rc genhtml_branch_coverage=1 00:04:15.809 --rc genhtml_function_coverage=1 00:04:15.809 --rc genhtml_legend=1 00:04:15.809 --rc geninfo_all_blocks=1 00:04:15.809 --rc geninfo_unexecuted_blocks=1 
00:04:15.809 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:15.809 ' 00:04:15.809 15:05:54 setup.sh -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:15.809 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:15.809 --rc genhtml_branch_coverage=1 00:04:15.809 --rc genhtml_function_coverage=1 00:04:15.809 --rc genhtml_legend=1 00:04:15.809 --rc geninfo_all_blocks=1 00:04:15.809 --rc geninfo_unexecuted_blocks=1 00:04:15.809 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:15.809 ' 00:04:15.809 15:05:54 setup.sh -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:15.809 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:15.809 --rc genhtml_branch_coverage=1 00:04:15.809 --rc genhtml_function_coverage=1 00:04:15.809 --rc genhtml_legend=1 00:04:15.809 --rc geninfo_all_blocks=1 00:04:15.809 --rc geninfo_unexecuted_blocks=1 00:04:15.809 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:15.809 ' 00:04:15.809 15:05:54 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:04:15.809 15:05:54 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:15.809 15:05:54 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:15.809 15:05:54 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:15.809 15:05:54 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:15.809 15:05:54 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:15.810 ************************************ 00:04:15.810 START TEST acl 00:04:15.810 ************************************ 00:04:15.810 15:05:54 setup.sh.acl -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:16.068 * Looking for test storage... 
00:04:16.068 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:16.068 15:05:54 setup.sh.acl -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:16.068 15:05:54 setup.sh.acl -- common/autotest_common.sh@1693 -- # lcov --version 00:04:16.068 15:05:54 setup.sh.acl -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:16.068 15:05:54 setup.sh.acl -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@336 -- # IFS=.-: 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@336 -- # read -ra ver1 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@337 -- # IFS=.-: 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@337 -- # read -ra ver2 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@338 -- # local 'op=<' 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@340 -- # ver1_l=2 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@341 -- # ver2_l=1 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@344 -- # case "$op" in 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@345 -- # : 1 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@365 -- # decimal 1 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@353 -- # local d=1 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@355 -- # echo 1 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@365 -- # ver1[v]=1 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@366 -- # decimal 2 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@353 -- # local d=2 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@355 -- # echo 2 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@366 -- # ver2[v]=2 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:16.068 15:05:54 setup.sh.acl -- scripts/common.sh@368 -- # return 0 00:04:16.068 15:05:54 setup.sh.acl -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:16.068 15:05:54 setup.sh.acl -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:16.068 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:16.068 --rc genhtml_branch_coverage=1 00:04:16.068 --rc genhtml_function_coverage=1 00:04:16.068 --rc genhtml_legend=1 00:04:16.068 --rc geninfo_all_blocks=1 00:04:16.068 --rc geninfo_unexecuted_blocks=1 00:04:16.069 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:16.069 ' 00:04:16.069 15:05:54 setup.sh.acl -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:16.069 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:16.069 --rc genhtml_branch_coverage=1 00:04:16.069 --rc 
genhtml_function_coverage=1 00:04:16.069 --rc genhtml_legend=1 00:04:16.069 --rc geninfo_all_blocks=1 00:04:16.069 --rc geninfo_unexecuted_blocks=1 00:04:16.069 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:16.069 ' 00:04:16.069 15:05:54 setup.sh.acl -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:16.069 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:16.069 --rc genhtml_branch_coverage=1 00:04:16.069 --rc genhtml_function_coverage=1 00:04:16.069 --rc genhtml_legend=1 00:04:16.069 --rc geninfo_all_blocks=1 00:04:16.069 --rc geninfo_unexecuted_blocks=1 00:04:16.069 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:16.069 ' 00:04:16.069 15:05:54 setup.sh.acl -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:16.069 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:16.069 --rc genhtml_branch_coverage=1 00:04:16.069 --rc genhtml_function_coverage=1 00:04:16.069 --rc genhtml_legend=1 00:04:16.069 --rc geninfo_all_blocks=1 00:04:16.069 --rc geninfo_unexecuted_blocks=1 00:04:16.069 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:16.069 ' 00:04:16.069 15:05:54 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:04:16.069 15:05:54 setup.sh.acl -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:04:16.069 15:05:54 setup.sh.acl -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:04:16.069 15:05:54 setup.sh.acl -- common/autotest_common.sh@1658 -- # local nvme bdf 00:04:16.069 15:05:54 setup.sh.acl -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:04:16.069 15:05:54 setup.sh.acl -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:04:16.069 15:05:54 setup.sh.acl -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:04:16.069 15:05:54 setup.sh.acl -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:16.069 15:05:54 setup.sh.acl -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:16.069 15:05:54 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:04:16.069 15:05:54 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:04:16.069 15:05:54 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:04:16.069 15:05:54 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:04:16.069 15:05:54 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:04:16.069 15:05:54 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:16.069 15:05:54 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:22.631 15:06:00 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:22.631 15:06:00 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:22.631 15:06:00 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:22.631 15:06:00 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:22.631 15:06:00 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:22.631 15:06:00 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:25.920 Hugepages 00:04:25.920 node hugesize free / total 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.920 15:06:04 
setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.920 00:04:25.920 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:25.920 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:1a:00.0 == *:*:*.* ]] 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 
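The acl test builds its device list by parsing `setup.sh status` one row at a time, as the read loop traced above shows: rows whose second field matches *:*:*.* are PCI devices, ioatdma rows are skipped with `continue`, and the NVMe controller at 0000:1a:00.0 is collected into the devs/drivers arrays. Roughly, with the column layout assumed from the `read -r _ dev _ _ _ driver _` call:

    # Collect NVMe controllers from `setup.sh status` output (field positions
    # assumed from the trace; hugepage/header rows and non-nvme drivers skipped).
    declare -a devs
    declare -A drivers
    while read -r _ dev _ _ _ driver _; do
        [[ $dev == *:*:*.* ]] || continue    # not a PCI BDF row
        [[ $driver == nvme ]] || continue    # e.g. ioatdma channels
        devs+=("$dev")
        drivers["$dev"]=$driver
    done < <(./scripts/setup.sh status)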
00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\1\a\:\0\0\.\0* ]] 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:25.921 15:06:04 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:25.921 15:06:04 setup.sh.acl -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:25.921 15:06:04 setup.sh.acl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:25.921 15:06:04 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:25.921 ************************************ 00:04:25.921 START TEST denied 00:04:25.921 ************************************ 00:04:25.921 15:06:04 setup.sh.acl.denied -- 
common/autotest_common.sh@1129 -- # denied 00:04:25.921 15:06:04 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:1a:00.0' 00:04:25.921 15:06:04 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:25.921 15:06:04 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:1a:00.0' 00:04:25.921 15:06:04 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:25.921 15:06:04 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:32.497 0000:1a:00.0 (8086 0a54): Skipping denied controller at 0000:1a:00.0 00:04:32.498 15:06:10 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:1a:00.0 00:04:32.498 15:06:10 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:32.498 15:06:10 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:32.498 15:06:10 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:1a:00.0 ]] 00:04:32.498 15:06:10 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:1a:00.0/driver 00:04:32.498 15:06:10 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:32.498 15:06:10 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:32.498 15:06:10 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:04:32.498 15:06:10 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:32.498 15:06:10 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:39.066 00:04:39.066 real 0m13.113s 00:04:39.066 user 0m3.888s 00:04:39.066 sys 0m8.376s 00:04:39.066 15:06:17 setup.sh.acl.denied -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:39.066 15:06:17 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:39.066 ************************************ 00:04:39.066 END TEST denied 00:04:39.066 ************************************ 00:04:39.066 15:06:17 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:39.066 15:06:17 setup.sh.acl -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:39.066 15:06:17 setup.sh.acl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:39.066 15:06:17 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:39.325 ************************************ 00:04:39.325 START TEST allowed 00:04:39.325 ************************************ 00:04:39.325 15:06:17 setup.sh.acl.allowed -- common/autotest_common.sh@1129 -- # allowed 00:04:39.325 15:06:17 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:1a:00.0 00:04:39.325 15:06:17 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:39.325 15:06:17 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:39.325 15:06:17 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:39.325 15:06:17 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:1a:00.0 .*: nvme -> .*' 00:04:49.307 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:04:49.307 15:06:26 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:04:49.307 15:06:26 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:04:49.307 15:06:26 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:04:49.307 15:06:26 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:49.307 15:06:26 setup.sh.acl.allowed 
-- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:54.599 00:04:54.599 real 0m14.807s 00:04:54.599 user 0m3.602s 00:04:54.599 sys 0m7.840s 00:04:54.599 15:06:32 setup.sh.acl.allowed -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:54.599 15:06:32 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:54.599 ************************************ 00:04:54.599 END TEST allowed 00:04:54.599 ************************************ 00:04:54.599 00:04:54.599 real 0m38.218s 00:04:54.599 user 0m10.848s 00:04:54.599 sys 0m23.326s 00:04:54.599 15:06:32 setup.sh.acl -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:54.599 15:06:32 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:54.599 ************************************ 00:04:54.599 END TEST acl 00:04:54.599 ************************************ 00:04:54.599 15:06:32 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:54.599 15:06:32 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:54.599 15:06:32 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:54.599 15:06:32 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:54.599 ************************************ 00:04:54.599 START TEST hugepages 00:04:54.599 ************************************ 00:04:54.599 15:06:32 setup.sh.hugepages -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:54.599 * Looking for test storage... 00:04:54.599 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:54.599 15:06:32 setup.sh.hugepages -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:54.599 15:06:32 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # lcov --version 00:04:54.599 15:06:32 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:54.599 15:06:32 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:54.599 15:06:32 setup.sh.hugepages -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:54.599 15:06:32 setup.sh.hugepages -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:54.599 15:06:32 setup.sh.hugepages -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:54.599 15:06:32 setup.sh.hugepages -- scripts/common.sh@336 -- # IFS=.-: 00:04:54.599 15:06:32 setup.sh.hugepages -- scripts/common.sh@336 -- # read -ra ver1 00:04:54.599 15:06:32 setup.sh.hugepages -- scripts/common.sh@337 -- # IFS=.-: 00:04:54.599 15:06:32 setup.sh.hugepages -- scripts/common.sh@337 -- # read -ra ver2 00:04:54.599 15:06:32 setup.sh.hugepages -- scripts/common.sh@338 -- # local 'op=<' 00:04:54.599 15:06:32 setup.sh.hugepages -- scripts/common.sh@340 -- # ver1_l=2 00:04:54.599 15:06:32 setup.sh.hugepages -- scripts/common.sh@341 -- # ver2_l=1 00:04:54.599 15:06:32 setup.sh.hugepages -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:54.599 15:06:32 setup.sh.hugepages -- scripts/common.sh@344 -- # case "$op" in 00:04:54.599 15:06:32 setup.sh.hugepages -- scripts/common.sh@345 -- # : 1 00:04:54.599 15:06:32 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:54.599 15:06:32 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:54.599 15:06:32 setup.sh.hugepages -- scripts/common.sh@365 -- # decimal 1 00:04:54.599 15:06:32 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=1 00:04:54.599 15:06:32 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 1 00:04:54.600 15:06:32 setup.sh.hugepages -- scripts/common.sh@365 -- # ver1[v]=1 00:04:54.600 15:06:32 setup.sh.hugepages -- scripts/common.sh@366 -- # decimal 2 00:04:54.600 15:06:32 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=2 00:04:54.600 15:06:32 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 2 00:04:54.600 15:06:32 setup.sh.hugepages -- scripts/common.sh@366 -- # ver2[v]=2 00:04:54.600 15:06:32 setup.sh.hugepages -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:54.600 15:06:32 setup.sh.hugepages -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:54.600 15:06:32 setup.sh.hugepages -- scripts/common.sh@368 -- # return 0 00:04:54.600 15:06:32 setup.sh.hugepages -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:54.600 15:06:32 setup.sh.hugepages -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:54.600 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:54.600 --rc genhtml_branch_coverage=1 00:04:54.600 --rc genhtml_function_coverage=1 00:04:54.600 --rc genhtml_legend=1 00:04:54.600 --rc geninfo_all_blocks=1 00:04:54.600 --rc geninfo_unexecuted_blocks=1 00:04:54.600 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:54.600 ' 00:04:54.600 15:06:32 setup.sh.hugepages -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:54.600 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:54.600 --rc genhtml_branch_coverage=1 00:04:54.600 --rc genhtml_function_coverage=1 00:04:54.600 --rc genhtml_legend=1 00:04:54.600 --rc geninfo_all_blocks=1 00:04:54.600 --rc geninfo_unexecuted_blocks=1 00:04:54.600 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:54.600 ' 00:04:54.600 15:06:32 setup.sh.hugepages -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:54.600 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:54.600 --rc genhtml_branch_coverage=1 00:04:54.600 --rc genhtml_function_coverage=1 00:04:54.600 --rc genhtml_legend=1 00:04:54.600 --rc geninfo_all_blocks=1 00:04:54.600 --rc geninfo_unexecuted_blocks=1 00:04:54.600 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:54.600 ' 00:04:54.600 15:06:32 setup.sh.hugepages -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:54.600 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:54.600 --rc genhtml_branch_coverage=1 00:04:54.600 --rc genhtml_function_coverage=1 00:04:54.600 --rc genhtml_legend=1 00:04:54.600 --rc geninfo_all_blocks=1 00:04:54.600 --rc geninfo_unexecuted_blocks=1 00:04:54.600 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:54.600 ' 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:54.600 15:06:32 
setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 69075932 kB' 'MemAvailable: 74103468 kB' 'Buffers: 10272 kB' 'Cached: 15075096 kB' 'SwapCached: 0 kB' 'Active: 11834232 kB' 'Inactive: 4123708 kB' 'Active(anon): 10632148 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 875840 kB' 'Mapped: 159724 kB' 'Shmem: 9759576 kB' 'KReclaimable: 537664 kB' 'Slab: 1164532 kB' 'SReclaimable: 537664 kB' 'SUnreclaim: 626868 kB' 'KernelStack: 17776 kB' 'PageTables: 9328 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52434164 kB' 'Committed_AS: 12013456 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215632 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB' 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': 
' 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.600 15:06:32 
setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.600 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r 
var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 
15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 
15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.601 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 
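get_meminfo, traced at length above, walks every /proc/meminfo key until it reaches the requested one; the `mem=("${mem[@]#Node +([0-9]) }")` step strips per-node prefixes so the same helper also works on /sys/devices/system/node/node*/meminfo. A compact equivalent of what the loop computes (streaming read instead of mapfile is an assumption of this sketch):

    # Print the value for one meminfo key, e.g. Hugepagesize -> 2048 (kB).
    get_meminfo() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < "${2:-/proc/meminfo}"
        return 1
    }
    default_hugepages=$(get_meminfo Hugepagesize)   # 2048 on this runner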
00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGEMEM 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGENODE 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v NRHUGE 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@197 -- # get_nodes 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@26 -- # local node 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@31 -- # no_nodes=2 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@198 -- # clear_hp 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:04:54.602 15:06:32 setup.sh.hugepages -- setup/hugepages.sh@200 -- # run_test single_node_setup single_node_setup 00:04:54.602 15:06:32 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:54.602 15:06:32 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:54.602 15:06:32 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:54.602 ************************************ 00:04:54.602 START TEST single_node_setup 00:04:54.602 ************************************ 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1129 -- # single_node_setup 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@135 -- # get_test_nr_hugepages 2097152 0 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup 
-- setup/hugepages.sh@48 -- # local size=2097152 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@49 -- # (( 2 > 1 )) 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@50 -- # shift 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # node_ids=('0') 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # local node_ids 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # user_nodes=('0') 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # local user_nodes 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # nodes_test=() 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # local -g nodes_test 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@68 -- # (( 1 > 0 )) 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}" 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@72 -- # return 0 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # NRHUGE=1024 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # HUGENODE=0 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # setup output 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:04:54.602 15:06:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:58.790 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:58.790 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:58.790 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:58.790 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:58.790 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:58.790 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:58.790 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:58.790 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:58.790 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:58.790 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:58.790 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:58.790 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:58.790 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:58.790 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:58.790 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:58.790 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:01.325 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@137 -- # 
verify_nr_hugepages 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@88 -- # local node 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@89 -- # local sorted_t 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@90 -- # local sorted_s 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@91 -- # local surp 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@92 -- # local resv 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@93 -- # local anon 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 71214004 kB' 'MemAvailable: 76241260 kB' 'Buffers: 10272 kB' 'Cached: 15075276 kB' 'SwapCached: 0 kB' 'Active: 11835012 kB' 'Inactive: 4123708 kB' 'Active(anon): 10632928 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 876028 kB' 'Mapped: 159820 kB' 'Shmem: 9759756 kB' 'KReclaimable: 537384 kB' 'Slab: 1162852 kB' 'SReclaimable: 537384 kB' 'SUnreclaim: 625468 kB' 'KernelStack: 17616 kB' 'PageTables: 9256 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482740 kB' 'Committed_AS: 12013244 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215472 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB' 00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
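The trace so far has shown three setup steps in a row: get_nodes counted two NUMA nodes, clear_hp wrote 0 into every per-node nr_hugepages file with CLEAR_HUGE=yes exported, and get_test_nr_hugepages turned the requested size into a page count (2097152 / 2048 = 1024 pages, pinned to node 0 via HUGENODE=0) before setup.sh rebound the devices and verify_nr_hugepages began. A condensed sketch of that clear-then-allocate flow (illustrative, requires root; the sysfs paths match the ones in the trace):

    # Drop any stale reservations on every node and page size first...
    for hp in /sys/devices/system/node/node[0-9]*/hugepages/hugepages-*/nr_hugepages; do
        echo 0 > "$hp"
    done
    # ...then reserve 1024 x 2048 kB pages (2 GiB) on node 0 only.
    echo 1024 > /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages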
00:05:03.865 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
[... xtrace omitted: get_meminfo walks every remaining /proc/meminfo key (MemFree through HardwareCorrupted), skipping each with continue until the requested key matches ...]
00:05:03.866 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:03.866 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:05:03.866 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:05:03.866 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # anon=0
00:05:03.866 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:03.866 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:03.866 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=
00:05:03.866 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:05:03.866 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:05:03.866 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:03.866 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:03.866 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:03.866 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:05:03.866 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:03.866 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 71214428 kB' 'MemAvailable: 76241588 kB' 'Buffers: 10272 kB' 'Cached: 15075292 kB' 'SwapCached: 0 kB' 'Active: 11834364 kB' 'Inactive: 4123708 kB' 'Active(anon): 10632280 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 875832 kB' 'Mapped: 159668 kB' 'Shmem: 9759772 kB' 'KReclaimable: 537288 kB' 'Slab: 1162736 kB' 'SReclaimable: 537288 kB' 'SUnreclaim: 625448 kB' 'KernelStack: 17616 kB' 'PageTables: 9236 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482740 kB' 'Committed_AS: 12013264 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215472 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB'
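The snapshot above already contains the counters verify_nr_hugepages cares about: HugePages_Total: 1024, HugePages_Free: 1024, HugePages_Rsvd: 0, HugePages_Surp: 0 and Hugetlb: 2097152 kB (1024 pages at 2048 kB each). The surrounding trace is simply get_meminfo being called once per counter (anon, then surp, then resv). Reusing the sketch helper from earlier, the bookkeeping reduces to something like this (illustrative names, not the literal hugepages.sh check):

    anon=$(get_meminfo_sketch AnonHugePages)   # 0 kB: THP is not skewing the numbers
    surp=$(get_meminfo_sketch HugePages_Surp)  # 0: no surplus pages in play
    resv=$(get_meminfo_sketch HugePages_Rsvd)  # 0: no pending reservations
    total=$(get_meminfo_sketch HugePages_Total)
    free=$(get_meminfo_sketch HugePages_Free)
    # With nothing reserved or surplus, every allocated page should still be free.
    (( free == total - resv - surp )) || echo "hugepage accounting mismatch" >&2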
[... xtrace omitted: get_meminfo walks the snapshot key by key again (IFS=': ' / read -r var val _ / test / continue), skipping MemTotal through HugePages_Rsvd until the requested key matches ...]
00:05:03.868 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:03.868 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:05:03.868 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:05:03.868 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # surp=0
00:05:03.868 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:03.868 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local
get=HugePages_Rsvd 00:05:03.868 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:03.868 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:03.868 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:03.868 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.868 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.868 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.868 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.868 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.868 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 71214888 kB' 'MemAvailable: 76242048 kB' 'Buffers: 10272 kB' 'Cached: 15075316 kB' 'SwapCached: 0 kB' 'Active: 11834400 kB' 'Inactive: 4123708 kB' 'Active(anon): 10632316 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 875832 kB' 'Mapped: 159668 kB' 'Shmem: 9759796 kB' 'KReclaimable: 537288 kB' 'Slab: 1162736 kB' 'SReclaimable: 537288 kB' 'SUnreclaim: 625448 kB' 'KernelStack: 17616 kB' 'PageTables: 9236 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482740 kB' 'Committed_AS: 12013284 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215472 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB' 00:05:03.868 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.868 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.869 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.869 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.869 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.869 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.869 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.869 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.869 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.869 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.869 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.869 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.869 15:06:42 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
[... xtrace omitted: the HugePages_Rsvd scan continues key by key (Buffers, Cached, SwapCached, the Active/Inactive and swap counters, AnonPages, Mapped, Shmem, the slab fields, KernelStack), skipping each with continue; the captured trace breaks off mid-scan at 00:05:03.870 ...]
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # resv=0 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:03.870 nr_hugepages=1024 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:03.870 resv_hugepages=0 00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:03.870 surplus_hugepages=0 00:05:03.870 15:06:42 
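The scan condensed above is the body of setup/common.sh's get_meminfo helper: point mem_f at /proc/meminfo (or a per-node copy), strip any "Node N " prefix, then read "field: value" pairs until the requested field matches. A minimal sketch reconstructed from the common.sh@17-33 xtrace entries follows; the upstream source may differ in detail, so treat it as an illustration rather than the verbatim function:

    #!/usr/bin/env bash
    shopt -s extglob    # the +([0-9]) pattern below needs extglob

    # get_meminfo FIELD [NODE]: print FIELD's value from /proc/meminfo,
    # or from the per-node copy when NODE is given and sysfs exposes it.
    get_meminfo() {
        local get=$1 node=${2:-}
        local var val
        local mem_f mem
        mem_f=/proc/meminfo
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem <"$mem_f"
        # Per-node meminfo prefixes every line with "Node N "; strip it.
        mem=("${mem[@]#Node +([0-9]) }")
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && echo "$val" && return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

Called as get_meminfo HugePages_Rsvd, this is what produced the echo 0 / resv=0 pair above.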
00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:03.870 anon_hugepages=0
00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:03.870 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=
00:05:03.871 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:05:03.871 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:05:03.871 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:03.871 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:03.871 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:03.871 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:05:03.871 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:03.871 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 71215392 kB' 'MemAvailable: 76242552 kB' 'Buffers: 10272 kB' 'Cached: 15075320 kB' 'SwapCached: 0 kB' 'Active: 11834784 kB' 'Inactive: 4123708 kB' 'Active(anon): 10632700 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 876244 kB' 'Mapped: 159668 kB' 'Shmem: 9759800 kB' 'KReclaimable: 537288 kB' 'Slab: 1162736 kB' 'SReclaimable: 537288 kB' 'SUnreclaim: 625448 kB' 'KernelStack: 17600 kB' 'PageTables: 9184 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482740 kB' 'Committed_AS: 12012940 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215440 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB'
00:05:03.871 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31-32 -- # [xtrace condensed: field-scan loop walked MemTotal through Unaccepted until it matched HugePages_Total]
00:05:03.872 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 1024
00:05:03.872 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:05:03.872 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:03.872 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@111 -- # get_nodes
00:05:03.872 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@26 -- # local node
00:05:03.872 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:03.872 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:05:03.872 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
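The get_nodes trace that begins here (its remaining iterations continue in the next chunk) enumerates the NUMA nodes under /sys/devices/system/node and records each node's current hugepage count in nodes_sys[]. A hedged sketch of that enumeration; the trace only shows the resulting assignments (1024 for node0, 0 for node1), so reading the count from the standard nr_hugepages sysfs attribute is an assumption on my part:

    #!/usr/bin/env bash
    shopt -s extglob
    declare -a nodes_sys=()
    no_nodes=0

    get_nodes() {
        local node
        for node in /sys/devices/system/node/node+([0-9]); do
            # Index by numeric node id; the value is the number of 2 MiB
            # hugepages currently assigned to that node (assumed source).
            nodes_sys[${node##*node}]=$(<"$node/hugepages/hugepages-2048kB/nr_hugepages")
        done
        no_nodes=${#nodes_sys[@]}
        (( no_nodes > 0 ))    # sanity: at least one node must be visible
    }

On this host the loop finds node0 and node1, hence the no_nodes=2 below.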
00:05:03.872 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0
00:05:03.872 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:03.872 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:03.872 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:03.872 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:03.872 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:03.872 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:03.872 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=0
00:05:03.872 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:05:03.872 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:05:03.872 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:03.873 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:03.873 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:03.873 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:05:03.873 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:03.873 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48064864 kB' 'MemFree: 38884388 kB' 'MemUsed: 9180476 kB' 'SwapCached: 0 kB' 'Active: 5352016 kB' 'Inactive: 251764 kB' 'Active(anon): 4434544 kB' 'Inactive(anon): 0 kB' 'Active(file): 917472 kB' 'Inactive(file): 251764 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4849908 kB' 'Mapped: 98872 kB' 'AnonPages: 757064 kB' 'Shmem: 3680672 kB' 'KernelStack: 10680 kB' 'PageTables: 6372 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 198200 kB' 'Slab: 563928 kB' 'SReclaimable: 198200 kB' 'SUnreclaim: 365728 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:05:03.873 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@31-32 -- # [xtrace condensed: field-scan loop walked node0's fields (MemTotal ... HugePages_Total, HugePages_Free) until it matched HugePages_Surp]
00:05:03.874 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:05:03.874 15:06:42 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:05:03.874 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:03.874 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:03.874 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:03.874 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:03.874 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024'
00:05:03.874 node0=1024 expecting 1024
00:05:03.874 15:06:42 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]]
00:05:03.874 
00:05:03.874 real 0m9.279s
00:05:03.874 user 0m2.061s
00:05:03.874 sys 0m4.139s
00:05:03.874 15:06:42 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:03.874 15:06:42 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@10 -- # set +x
00:05:03.874 ************************************
00:05:03.874 END TEST single_node_setup
00:05:03.874 ************************************
00:05:03.874 15:06:42 setup.sh.hugepages -- setup/hugepages.sh@201 -- # run_test even_2G_alloc even_2G_alloc
00:05:03.874 15:06:42 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:03.874 15:06:42 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:03.874 15:06:42 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:03.874 ************************************
00:05:03.874 START TEST even_2G_alloc
00:05:03.874 ************************************
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1129 -- # even_2G_alloc
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@142 -- # get_test_nr_hugepages 2097152
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
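even_2G_alloc, whose setup is traced starting above and continuing below, converts the requested size into pages (2097152 kB / 2048 kB per page = 1024 pages) and then splits them evenly across the nodes. A small sketch of that split, reconstructed from the nodes_test[_no_nodes - 1]=512 entries at hugepages.sh@80-83; the parameter passing is simplified here, since the traced helper takes its values from surrounding globals:

    # Distribute _nr_hugepages evenly over _no_nodes, filling from the
    # highest node id down, as the traced loop does.
    get_test_nr_hugepages_per_node() {
        local _nr_hugepages=$1 _no_nodes=$2
        local _per_node=$(( _nr_hugepages / _no_nodes ))
        while (( _no_nodes > 0 )); do
            nodes_test[_no_nodes - 1]=$_per_node
            (( _no_nodes-- ))
        done
    }

    nodes_test=()
    get_test_nr_hugepages_per_node 1024 2
    # nodes_test is now (512 512), matching the trace below.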
00:05:03.874 15:06:42 setup.sh.hugepages -- setup/hugepages.sh@201 -- # run_test even_2G_alloc even_2G_alloc
00:05:03.874 15:06:42 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:03.874 15:06:42 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:03.874 15:06:42 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:03.874 ************************************
00:05:03.874 START TEST even_2G_alloc
00:05:03.874 ************************************
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1129 -- # even_2G_alloc
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@142 -- # get_test_nr_hugepages 2097152
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 512
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 1
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 0
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # NRHUGE=1024
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # setup output
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:03.874 15:06:42 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:08.080 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:08.080 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:08.080 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:08.080 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:08.080 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:08.080 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:08.080 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:08.080 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:08.080 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:08.080 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:08.080 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:08.080 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:08.080 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:08.080 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:08.080 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:08.080 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:08.080 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@144 -- # verify_nr_hugepages
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@88 -- # local node
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_s
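Before the verification trace resumes, a sketch of what get_test_nr_hugepages_per_node did above: 1024 default-size (2048 kB) hugepages spread evenly over the host's two NUMA nodes, 512 each. Variable names follow the trace; the explicit division is our reading of the repeated nodes_test[_no_nodes - 1]=512 assignments:

    nr_hugepages=1024   # 2097152 kB requested / 2048 kB per page
    _no_nodes=2
    declare -a nodes_test
    _left=$_no_nodes
    while (( _left > 0 )); do
        nodes_test[_left - 1]=$(( nr_hugepages / _no_nodes ))   # 512 per node
        (( _left-- ))
    done
    echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"   # node0=512 node1=512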
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:09.990 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 71218652 kB' 'MemAvailable: 76245796 kB' 'Buffers: 10272 kB' 'Cached: 15075468 kB' 'SwapCached: 0 kB' 'Active: 11836832 kB' 'Inactive: 4123708 kB' 'Active(anon): 10634748 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 877368 kB' 'Mapped: 159036 kB' 'Shmem: 9759948 kB' 'KReclaimable: 537272 kB' 'Slab: 1163224 kB' 'SReclaimable: 537272 kB' 'SUnreclaim: 625952 kB' 'KernelStack: 17872 kB' 'PageTables: 9988 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482740 kB' 'Committed_AS: 12007460 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215728 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB'
[trace condensed: the setup/common.sh@31-32 read loop continues past every key from MemTotal through HardwareCorrupted until AnonHugePages matches]
00:05:09.991 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:09.991 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:09.991 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:09.991 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # anon=0
00:05:09.991 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:09.991 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:09.991 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:09.991 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:09.991 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:09.991 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:09.991 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:09.991 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:09.991 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:09.991 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:09.991 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:09.991 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
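The mapfile and prefix-strip steps just traced are what let the same parser serve both the global file and a per-node one. A sketch of that path selection (the node number here is hypothetical; in the run above node was empty, so /proc/meminfo was used, and the +([0-9]) pattern needs extglob):

    shopt -s extglob
    node=0   # hypothetical; unset in the run above
    mem_f=/proc/meminfo
    if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # drop the "Node N " prefix per-node files carry
    printf '%s\n' "${mem[@]:0:3}"      # first three normalized lines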
00:05:09.991 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 71219580 kB' 'MemAvailable: 76246724 kB' 'Buffers: 10272 kB' 'Cached: 15075468 kB' 'SwapCached: 0 kB' 'Active: 11836692 kB' 'Inactive: 4123708 kB' 'Active(anon): 10634608 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 877280 kB' 'Mapped: 158976 kB' 'Shmem: 9759948 kB' 'KReclaimable: 537272 kB' 'Slab: 1163108 kB' 'SReclaimable: 537272 kB' 'SUnreclaim: 625836 kB' 'KernelStack: 17680 kB' 'PageTables: 9240 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482740 kB' 'Committed_AS: 12007476 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215696 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB'
[trace condensed: the setup/common.sh@31-32 read loop continues past every key from MemTotal through HugePages_Rsvd until HugePages_Surp matches]
00:05:09.993 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:09.993 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:09.993 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:09.993 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # surp=0
00:05:09.993 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:09.993 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:09.993 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:09.993 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:09.993 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:09.993 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:09.993 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:09.993 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:09.993 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:09.993 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:09.993 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:09.993 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
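At this point the verifier holds anon=0 and surp=0 and is fetching HugePages_Rsvd. Our reading, stated cautiously: these are the values that could make free-page counts disagree with the configured total, so they are collected before the per-node comparison; the exact arithmetic lives in hugepages.sh and is not shown in this excerpt. A self-contained way to pull the same numbers (the awk helper is ours, not SPDK's):

    meminfo_val() { awk -v k="$1:" '$1 == k { print $2 }' /proc/meminfo; }
    anon=$(meminfo_val AnonHugePages)     # kB of transparent hugepages in use, 0 here
    surp=$(meminfo_val HugePages_Surp)    # surplus pages beyond the configured pool, 0
    resv=$(meminfo_val HugePages_Rsvd)    # reserved but not yet faulted-in pages, 0
    total=$(meminfo_val HugePages_Total)  # 1024 in all three snapshots above
    echo "total=$total surp=$surp resv=$resv anon_kB=$anon"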
00:05:09.993 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 71220964 kB' 'MemAvailable: 76248108 kB' 'Buffers: 10272 kB' 'Cached: 15075492 kB' 'SwapCached: 0 kB' 'Active: 11835496 kB' 'Inactive: 4123708 kB' 'Active(anon): 10633412 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 876576 kB' 'Mapped: 158900 kB' 'Shmem: 9759972 kB' 'KReclaimable: 537272 kB' 'Slab: 1162948 kB' 'SReclaimable: 537272 kB' 'SUnreclaim: 625676 kB' 'KernelStack: 17472 kB' 'PageTables: 8652 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482740 kB' 'Committed_AS: 12004864 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215584 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB'
[trace condensed: the setup/common.sh@31-32 read loop begins scanning for HugePages_Rsvd, continuing past MemTotal, MemFree, and the keys that follow; the excerpt breaks off mid-scan after the ShmemHugePages key]
setup/common.sh@31 -- # read -r var val _ 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@33 -- # echo 0 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:09.995 nr_hugepages=1024 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:09.995 resv_hugepages=0 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:09.995 surplus_hugepages=0 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:09.995 anon_hugepages=0 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 71220712 kB' 'MemAvailable: 76247856 kB' 'Buffers: 10272 kB' 'Cached: 15075512 kB' 'SwapCached: 0 kB' 'Active: 11835964 kB' 'Inactive: 4123708 kB' 'Active(anon): 10633880 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 877100 kB' 'Mapped: 158896 kB' 'Shmem: 9759992 kB' 'KReclaimable: 537272 kB' 'Slab: 1162788 kB' 'SReclaimable: 537272 kB' 'SUnreclaim: 625516 kB' 'KernelStack: 17584 kB' 'PageTables: 9060 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482740 kB' 'Committed_AS: 12005976 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215616 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 
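The helper being traced here can be reconstructed almost line for line from the xtrace markers (setup/common.sh@16-33). Below is a minimal sketch of that get_meminfo logic; every traced statement appears in the log above, but the control flow stitching them together (the if around @23-@25, the loop plumbing at @16) is an assumption rather than the verbatim SPDK source:

    #!/usr/bin/env bash
    # Sketch of get_meminfo, reconstructed from the xtrace above; not verbatim source.
    shopt -s extglob # needed for the +([0-9]) pattern below

    get_meminfo() {
        local get=$1 node=$2
        local var val
        local mem_f mem
        mem_f=/proc/meminfo
        # With a node argument, read the NUMA-local counters instead (@23-@24).
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Node-local files prefix each line with "Node N "; strip it (@29).
        mem=("${mem[@]#Node +([0-9]) }")
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue # the per-field skipping seen above (@32)
            echo "$val"                      # @33: print the matched value
            return 0
        done < <(printf '%s\n' "${mem[@]}") # @16 feeds the read loop
        return 1
    }

Called as get_meminfo HugePages_Total it prints 1024 on the machine in this log; get_meminfo HugePages_Surp 0 prints 0 for NUMA node 0, as the traces below show.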
00:05:09.995 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31-32 -- # [... xtrace again walks every /proc/meminfo field in order (MemTotal through HugePages_Free), continue-ing on each until HugePages_Total matches ...]
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@26 -- # local node
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
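get_nodes (hugepages.sh@26-31 in the trace) builds one counter per NUMA node; the log shows both nodes ending up at 512. A sketch follows, under the assumption that the 512 comes from the standard per-node sysfs nr_hugepages attribute, since the xtrace only shows the already-expanded nodes_sys[...]=512 assignments:

    shopt -s extglob
    declare -a nodes_sys

    get_nodes() {
        local node
        for node in /sys/devices/system/node/node+([0-9]); do # @28
            # Assumed source of the 512 seen above: the node's 2M-page counter.
            nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages") # @29
        done
        no_nodes=${#nodes_sys[@]} # @31: 2 on this machine
        (( no_nodes > 0 ))        # @32
    }

With nr_hugepages=1024 and Hugepagesize: 2048 kB, an even split across two nodes is exactly 512 pages (1 GiB) per node, which is what both per-node dumps below report as HugePages_Total; that is the property the even_2G_alloc test is checking.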
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:09.997 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48064864 kB' 'MemFree: 39920336 kB' 'MemUsed: 8144528 kB' 'SwapCached: 0 kB' 'Active: 5358276 kB' 'Inactive: 251764 kB' 'Active(anon): 4440804 kB' 'Inactive(anon): 0 kB' 'Active(file): 917472 kB' 'Inactive(file): 251764 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4850012 kB' 'Mapped: 98344 kB' 'AnonPages: 763156 kB' 'Shmem: 3680776 kB' 'KernelStack: 10632 kB' 'PageTables: 6172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 198192 kB' 'Slab: 563888 kB' 'SReclaimable: 198192 kB' 'SUnreclaim: 365696 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:05:09.998 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31-32 -- # [... xtrace walks the node0 meminfo fields (MemTotal through HugePages_Free), continue-ing on each until HugePages_Surp matches ...]
00:05:09.999 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:09.999 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
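Note that the node0 dump above carries no "Node 0 " prefixes even though the raw sysfs file does; that is the mem=("${mem[@]#Node +([0-9]) }") step at @29 doing its job before the read loop runs. A quick illustration, with the head output's column layout assumed and the values taken from this log:

    $ head -2 /sys/devices/system/node/node0/meminfo
    Node 0 MemTotal:       48064864 kB
    Node 0 MemFree:        39920336 kB

    $ shopt -s extglob
    $ line='Node 0 MemTotal:       48064864 kB'
    $ echo "${line#Node +([0-9]) }"
    MemTotal:       48064864 kB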
00:05:09.999 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:09.999 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:09.999 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:09.999 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
00:05:09.999 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:09.999 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1
00:05:09.999 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:09.999 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:09.999 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:09.999 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:05:09.999 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:09.999 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:09.999 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:09.999 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:09.999 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:09.999 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44220560 kB' 'MemFree: 31293172 kB' 'MemUsed: 12927388 kB' 'SwapCached: 0 kB' 'Active: 6482800 kB' 'Inactive: 3871944 kB' 'Active(anon): 6198188 kB' 'Inactive(anon): 0 kB' 'Active(file): 284612 kB' 'Inactive(file): 3871944 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10235816 kB' 'Mapped: 60732 kB' 'AnonPages: 119008 kB' 'Shmem: 6079260 kB' 'KernelStack: 6936 kB' 'PageTables: 2836 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 339080 kB' 'Slab: 598900 kB' 'SReclaimable: 339080 kB' 'SUnreclaim: 259820 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31-32 -- # [... xtrace walks the node1 meminfo fields, continue-ing on each; this excerpt ends mid-iteration at the AnonHugePages comparison ...]
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- 
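The hundreds of continue entries condensed above all come from one small helper: setup/common.sh's get_meminfo mapfile-reads the chosen meminfo file and walks it line by line until the requested key matches, echoing the value. A minimal re-creation of that idiom, assuming bash 4+ with extglob; names mirror the xtrace, but this is a sketch rather than the exact SPDK source:

    #!/usr/bin/env bash
    # Hypothetical re-creation of the get_meminfo idiom traced above.
    shopt -s extglob

    get_meminfo() {
        local get=$1 node=${2:-} var val _ line
        local mem_f=/proc/meminfo
        local -a mem
        # common.sh@23-24: per-node counters live under /sys when a node id is given
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # common.sh@29: per-node files prefix each line with "Node <id> "; strip it
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            # Unit-less keys such as HugePages_* still leave their number in $val
            [[ $var == "$get" ]] && { echo "${val:-0}"; return 0; }
        done
        return 1
    }

    get_meminfo HugePages_Surp 1   # prints 0 given the node1 state dumped above

The escaped patterns in the trace (\H\u\g\e\P\a\g\e\s\_\S\u\r\p) are just how xtrace prints the quoted right-hand side of the [[ $var == "$get" ]] comparison.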
00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512'
00:05:10.000 node0=512 expecting 512
00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512'
00:05:10.000 node1=512 expecting 512
00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@129 -- # [[ 512 == \5\1\2 ]]
00:05:10.000
00:05:10.000 real 0m6.231s
00:05:10.000 user 0m2.189s
00:05:10.000 sys 0m3.928s
00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:10.000 15:06:48 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:10.000 ************************************
00:05:10.000 END TEST even_2G_alloc
00:05:10.000 ************************************
00:05:10.000 15:06:48 setup.sh.hugepages -- setup/hugepages.sh@202 -- # run_test odd_alloc odd_alloc
00:05:10.000 15:06:48 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:10.000 15:06:48 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:10.000 15:06:48 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:10.260 ************************************
00:05:10.260 START TEST odd_alloc
00:05:10.260 ************************************
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1129 -- # odd_alloc
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@149 -- # get_test_nr_hugepages 2098176
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@48 -- # local size=2098176
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1025
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1025
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
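odd_alloc asks for size=2098176 kB and arrives at nr_hugepages=1025, which is consistent with a ceiling division by the 2048 kB default hugepage size; the @80-@83 iterations traced just below then spread that deliberately odd count over _no_nodes=2. A sketch reconstructed from the trace (an approximation of hugepages.sh, not a copy; the ": 513" and ": 1" entries in the log match the no-op arithmetic evaluations shown here):

    # Hypothetical reconstruction of get_test_nr_hugepages / _per_node
    size_kb=2098176 hugepage_kb=2048
    nr_hugepages=$(( (size_kb + hugepage_kb - 1) / hugepage_kb ))   # ceiling: 1025

    _nr_hugepages=$nr_hugepages
    _no_nodes=2
    declare -a nodes_test
    while (( _no_nodes > 0 )); do
        # Give the highest-numbered node its integer share first; the
        # remainder drifts toward node 0, so 1025 over 2 nodes ends 513/512.
        nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))
        : $(( _nr_hugepages -= nodes_test[_no_nodes - 1] ))   # xtrace "# : 513", then "# : 0"
        : $(( --_no_nodes ))                                  # xtrace "# : 1", then "# : 0"
    done
    declare -p nodes_test   # declare -a nodes_test=([0]="513" [1]="512")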
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 513
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 1
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=513
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 0
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # HUGEMEM=2049
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # setup output
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:10.260 15:06:48 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:14.450 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:14.450 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:14.450 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:14.450 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:14.450 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:14.450 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:14.450 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:14.450 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:14.450 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:14.450 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:14.450 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:14.450 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:14.450 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:14.450 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:14.450 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:14.450 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:14.450 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@151 -- # verify_nr_hugepages
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@88 -- # local node
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
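Before sampling AnonHugePages, verify_nr_hugepages checks the transparent-hugepage mode string (hugepages.sh@95); this machine reports "always [madvise] never", so the sample proceeds. A hedged sketch of that guard, using the standard THP sysfs control file and the get_meminfo sketch from earlier (reconstructed from the trace, not copied from hugepages.sh):

    anon=0
    # Standard THP control file; reads e.g. "always [madvise] never"
    thp_state=$(</sys/kernel/mm/transparent_hugepage/enabled)
    if [[ $thp_state != *"[never]"* ]]; then
        # THP is not hard-disabled, so AnonHugePages is worth sampling
        anon=$(get_meminfo AnonHugePages)   # 0 kB in the dump that follows
    fi
    echo "anon=$anon"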
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:16.362 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 71222876 kB' 'MemAvailable: 76249956 kB' 'Buffers: 10272 kB' 'Cached: 15075680 kB' 'SwapCached: 0 kB' 'Active: 11837628 kB' 'Inactive: 4123708 kB' 'Active(anon): 10635544 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 878120 kB' 'Mapped: 159068 kB' 'Shmem: 9760160 kB' 'KReclaimable: 537208 kB' 'Slab: 1162308 kB' 'SReclaimable: 537208 kB' 'SUnreclaim: 625100 kB' 'KernelStack: 17584 kB' 'PageTables: 9008 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53481716 kB' 'Committed_AS: 12005680 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215520 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB'
[... xtrace condensed: the @31-32 loop walks every field above, issuing continue for each key that is not AnonHugePages ...]
00:05:16.363 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:16.363 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:16.363 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:16.363 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # anon=0
00:05:16.363 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:16.363 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:16.363 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:16.363 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:16.363 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:16.363 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:16.363 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:16.363 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:16.363 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:16.363 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:16.363 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:16.363 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:16.364 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 71223600 kB' 'MemAvailable: 76250680 kB' 'Buffers: 10272 kB' 'Cached: 15075700 kB' 'SwapCached: 0 kB' 'Active: 11836764 kB' 'Inactive: 4123708 kB' 'Active(anon): 10634680 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 877744 kB' 'Mapped: 158964 kB' 'Shmem: 9760180 kB' 'KReclaimable: 537208 kB' 'Slab: 1162292 kB' 'SReclaimable: 537208 kB' 'SUnreclaim: 625084 kB' 'KernelStack: 17568 kB' 'PageTables: 8996 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53481716 kB' 'Committed_AS: 12005696 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215488 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB'
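The dumps above report HugePages_Total: 1025 globally; the per-node halves of that figure live in the kernel's hugepage sysfs counters, which is what the test's later node0/node1 "expecting" checks effectively compare against. A small loop for reading them (the sysfs paths are standard kernel ABI; the commented output is what this run's 513/512 split should yield, inferred rather than shown in the log):

    # Read the per-node 2 MiB hugepage counters
    for node_dir in /sys/devices/system/node/node[0-9]*; do
        n=${node_dir##*node}
        echo "node$n=$(cat "$node_dir/hugepages/hugepages-2048kB/nr_hugepages")"
    done
    # expected for this run: node0=513
    #                        node1=512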
[... xtrace condensed: the @31-32 loop again walks every field of the dump above, issuing continue for each key that is not HugePages_Surp ...]
00:05:16.365 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:16.365 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:16.365 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:16.365 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # surp=0
00:05:16.365 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:16.365 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:16.365 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:16.365 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:16.365 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:16.365 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:16.365 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
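With anon and surp both banked at 0, the helper is now fetching HugePages_Rsvd; as the @114-@116 loop at the top of this excerpt showed, the reserved count plus each node's own surplus gets folded into the expected per-node totals before the final comparison. Continuing the earlier sketches (reconstructed from the trace, so an approximation of the real loop):

    resv=$(get_meminfo HugePages_Rsvd)   # 0 in the dumps above
    for node in "${!nodes_test[@]}"; do
        # hugepages.sh@115: global reserved pages inflate every node's target
        (( nodes_test[node] += resv ))
        # hugepages.sh@116: so does that node's own surplus (0 on both nodes here)
        surp_node=$(get_meminfo HugePages_Surp "$node")
        (( nodes_test[node] += surp_node ))
    done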
# [[ -n '' ]] 00:05:16.365 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:16.365 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:16.365 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.365 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.365 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 71223644 kB' 'MemAvailable: 76250724 kB' 'Buffers: 10272 kB' 'Cached: 15075704 kB' 'SwapCached: 0 kB' 'Active: 11837204 kB' 'Inactive: 4123708 kB' 'Active(anon): 10635120 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 878180 kB' 'Mapped: 158964 kB' 'Shmem: 9760184 kB' 'KReclaimable: 537208 kB' 'Slab: 1162292 kB' 'SReclaimable: 537208 kB' 'SUnreclaim: 625084 kB' 'KernelStack: 17584 kB' 'PageTables: 9052 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53481716 kB' 'Committed_AS: 12005716 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215488 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB' 00:05:16.365 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.365 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.366 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- 
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1025
00:05:16.367 nr_hugepages=1025
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:16.367 resv_hugepages=0
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:16.367 surplus_hugepages=0
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:16.367 anon_hugepages=0
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@106 -- # (( 1025 == nr_hugepages + surp + resv ))
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@108 -- # (( 1025 == nr_hugepages ))
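
The long runs of "[[ <field> == \H\u\g\e\P\a\g\e\s\_... ]]" / "continue" entries throughout this trace all come from a single helper, get_meminfo in setup/common.sh, xtraced once per meminfo field. A minimal sketch of that helper, reconstructed purely from the trace entries above (anything the log does not show, such as the return path on a miss, is an assumption):

shopt -s extglob   # needed for the +([0-9]) pattern below

get_meminfo() {    # usage: get_meminfo <field> [<numa node>]
    local get=$1 node=$2
    local var val
    local mem_f mem
    mem_f=/proc/meminfo
    # Per-node statistics live in sysfs; fall back to the global file.
    if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Node files prefix each line with "Node N "; strip it.
    mem=("${mem[@]#Node +([0-9]) }")
    # 'HugePages_Total: 1025' -> var=HugePages_Total, val=1025 ('kB' lands in _).
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < <(printf '%s\n' "${mem[@]}")
    return 1   # assumption: field not found
}

Called as get_meminfo HugePages_Rsvd it scans /proc/meminfo; called as get_meminfo HugePages_Surp 0 (as happens below) it scans node0's sysfs copy instead, which is why the per-node dumps carry fields like MemUsed and FilePages that the global file lacks.
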
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:16.367 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:16.368 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 71224104 kB' 'MemAvailable: 76251184 kB' 'Buffers: 10272 kB' 'Cached: 15075724 kB' 'SwapCached: 0 kB' 'Active: 11837188 kB' 'Inactive: 4123708 kB' 'Active(anon): 10635104 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 878152 kB' 'Mapped: 158964 kB' 'Shmem: 9760204 kB' 'KReclaimable: 537208 kB' 'Slab: 1162292 kB' 'SReclaimable: 537208 kB' 'SUnreclaim: 625084 kB' 'KernelStack: 17568 kB' 'PageTables: 9000 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53481716 kB' 'Committed_AS: 12005736 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215488 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB'
[xtrace elided: the same per-field loop walks this dump until HugePages_Total matches]
00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025
00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages + surp + resv ))
00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@26 -- # local node
00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=513
00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
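
At this point get_nodes has read the kernel's actual per-node layout into nodes_sys: 513 pages on node 0 and 512 on node 1, i.e. the odd page out of nr_hugepages=1025 landed on the first node. A hypothetical illustration of that as-even-as-possible split (the variable names and formula are ours, not lifted from hugepages.sh):

nr_hugepages=1025
no_nodes=2
expected=()
for ((node = 0; node < no_nodes; node++)); do
    # Base share for every node, plus one leftover page for the lowest nodes.
    ((expected[node] = nr_hugepages / no_nodes + (node < nr_hugepages % no_nodes ? 1 : 0)))
done
printf 'node%u=%u\n' 0 "${expected[0]}" 1 "${expected[1]}"   # node0=513, node1=512
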
setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48064864 kB' 'MemFree: 39936656 kB' 'MemUsed: 8128208 kB' 'SwapCached: 0 kB' 'Active: 5352748 kB' 'Inactive: 251764 kB' 'Active(anon): 4435276 kB' 'Inactive(anon): 0 kB' 'Active(file): 917472 kB' 'Inactive(file): 251764 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4850028 kB' 'Mapped: 98344 kB' 'AnonPages: 757556 kB' 'Shmem: 3680792 kB' 'KernelStack: 10616 kB' 'PageTables: 6120 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 198128 kB' 'Slab: 563496 kB' 'SReclaimable: 198128 kB' 'SUnreclaim: 365368 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.369 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.370 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 
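
[Annotation] The trace above is the tail of get_meminfo scanning node 0 for HugePages_Surp; the call that follows repeats the same scan for node 1, switching mem_f from /proc/meminfo to /sys/devices/system/node/node1/meminfo. A minimal sketch of the pattern being traced (names mirror the trace — get_meminfo, mem_f, mem — but this is an illustration, not the exact SPDK helper):

    shopt -s extglob   # required for the +([0-9]) pattern below

    get_meminfo() {
        local get=$1 node=$2
        local var val _
        local mem_f=/proc/meminfo
        local mem
        # With a node argument, read that NUMA node's meminfo instead.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node lines carry a "Node <n> " prefix; strip it so both
        # files parse identically ("HugePages_Surp: 0").
        mem=("${mem[@]#Node +([0-9]) }")
        while IFS=': ' read -r var val _; do
            # Under set -x, every non-matching key emits one [[ ]] test
            # plus a "continue", which is why the log runs so long.
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }
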
00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44220560 kB' 'MemFree: 31287952 kB' 'MemUsed: 12932608 kB' 'SwapCached: 0 kB' 'Active: 6484476 kB' 'Inactive: 3871944 kB' 'Active(anon): 6199864 kB' 'Inactive(anon): 0 kB' 'Active(file): 284612 kB' 'Inactive(file): 3871944 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10236004 kB' 'Mapped: 60620 kB' 'AnonPages: 120636 kB' 'Shmem: 6079448 kB' 'KernelStack: 7000 kB' 'PageTables: 3048 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 339080 kB' 'Slab: 598804 kB' 'SReclaimable: 339080 kB' 'SUnreclaim: 259724 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
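
[Annotation] Each get_meminfo call produces one test/continue pair per meminfo key under xtrace. For a reader following along outside the harness, the same lookup is a one-liner (illustrative only, not part of the test scripts):

    # Per-node file: fields are "Node <n> <Key>: <value> [kB]"
    awk '$3 == "HugePages_Surp:" { print $4 }' /sys/devices/system/node/node1/meminfo
    # Global file: fields are "<Key>: <value> [kB]"
    awk '$1 == "HugePages_Surp:" { print $2 }' /proc/meminfo
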
00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.371 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node0=513 expecting 513' 00:05:16.372 node0=513 expecting 513 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512' 00:05:16.372 node1=512 expecting 512 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@129 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:05:16.372 00:05:16.372 real 0m6.125s 00:05:16.372 user 0m2.110s 00:05:16.372 sys 0m4.017s 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:16.372 15:06:54 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:16.372 ************************************ 00:05:16.372 END TEST odd_alloc 00:05:16.372 ************************************ 00:05:16.372 15:06:54 setup.sh.hugepages -- setup/hugepages.sh@203 -- # run_test 
custom_alloc custom_alloc 00:05:16.372 15:06:54 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:16.372 15:06:54 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:16.372 15:06:54 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:16.372 ************************************ 00:05:16.372 START TEST custom_alloc 00:05:16.372 ************************************ 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1129 -- # custom_alloc 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@157 -- # local IFS=, 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@159 -- # local node 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # nodes_hp=() 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # local nodes_hp 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@162 -- # local nr_hugepages=0 _nr_hugepages=0 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@164 -- # get_test_nr_hugepages 1048576 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=1048576 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 )) 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=512 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=512 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 )) 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 256 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 1 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 0 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:16.372 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@165 -- # nodes_hp[0]=512 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc 
-- setup/hugepages.sh@166 -- # (( 2 > 1 )) 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # get_test_nr_hugepages 2097152 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=2097152 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 )) 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 1 > 0 )) 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@168 -- # nodes_hp[1]=1024 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}" 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}" 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # get_test_nr_hugepages_per_node 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- 
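
[Annotation] custom_alloc sizes two pools from the 2048 kB default hugepage size (Hugepagesize in the meminfo dumps later in this log): 1048576 kB / 2048 kB = 512 pages, pinned to node 0 via nodes_hp[0]=512, then 2097152 kB / 2048 kB = 1024 pages for node 1. A sketch of the arithmetic (variable names mirror the trace; the real helper does more bookkeeping):

    default_hugepages=2048                           # kB, Hugepagesize from /proc/meminfo
    nr_hugepages=$(( 1048576 / default_hugepages ))  # -> 512 pages for node 0
    nodes_hp[0]=$nr_hugepages
    nr_hugepages=$(( 2097152 / default_hugepages ))  # -> 1024 pages for node 1
    nodes_hp[1]=$nr_hugepages
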
setup/hugepages.sh@73 -- # (( 2 > 0 )) 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=1024 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # setup output 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:16.373 15:06:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:20.569 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:20.569 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:20.569 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:20.569 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:20.569 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:20.569 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:20.569 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:20.569 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:20.569 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:20.569 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:20.569 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:20.569 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:20.569 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:20.569 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:20.569 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:20.569 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:20.569 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:22.481 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nr_hugepages=1536 00:05:22.481 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # verify_nr_hugepages 00:05:22.481 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@88 -- # local node 00:05:22.481 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:05:22.481 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:05:22.481 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local surp 00:05:22.481 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local resv 00:05:22.481 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local anon 00:05:22.481 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:22.481 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:22.481 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:22.481 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:22.481 
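
[Annotation] The HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' string tells scripts/setup.sh how many 2048 kB pages to reserve on each node (512 + 1024 = 1536 total, which is why verify_nr_hugepages runs with nr_hugepages=1536). The "Already using the vfio-pci driver" lines confirm the NVMe and DMA-engine devices were already bound by an earlier run. Per-node reservation ultimately goes through the standard kernel sysfs knob; a minimal sketch of that assumed underlying mechanism (setup.sh itself also handles driver binding and hugetlbfs mounts):

    # Run as root; standard per-node hugepage sysfs interface.
    echo 512  > /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages
    echo 1024 > /sys/devices/system/node/node1/hugepages/hugepages-2048kB/nr_hugepages
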
15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:22.481 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:22.481 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 70184696 kB' 'MemAvailable: 75211744 kB' 'Buffers: 10272 kB' 'Cached: 15075880 kB' 'SwapCached: 0 kB' 'Active: 11838216 kB' 'Inactive: 4123708 kB' 'Active(anon): 10636132 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 878684 kB' 'Mapped: 159156 kB' 'Shmem: 9760360 kB' 'KReclaimable: 537176 kB' 'Slab: 1162144 kB' 'SReclaimable: 537176 kB' 'SUnreclaim: 624968 kB' 'KernelStack: 17552 kB' 'PageTables: 9060 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52958452 kB' 'Committed_AS: 12006520 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215552 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.482 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
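
[Annotation] verify_nr_hugepages first checks transparent hugepages — the earlier [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] test, where the bracketed word is the active THP mode — and only when THP is not fully disabled does it sample AnonHugePages, which is the get_meminfo scan in progress here. A sketch of that check (the mode string is read from the standard sysfs path):

    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)  # e.g. "always [madvise] never"
    if [[ $thp != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)   # helper sketched earlier; 0 kB in this run
    fi
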
00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
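
[Annotation] The meminfo snapshot printed for this call already reflects the custom allocation: HugePages_Total and HugePages_Free are 1536 (512 on node 0 plus 1024 on node 1) and Hugetlb is 3145728 kB, i.e. 1536 pages at 2048 kB each. The scan below then matches AnonHugePages (0 kB), so anon ends up 0 before the surplus check:

    echo $(( 512 + 1024 ))    # -> 1536 hugepages total
    echo $(( 1536 * 2048 ))   # -> 3145728 kB of Hugetlb
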
00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # anon=0 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:22.483 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:22.484 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.484 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.484 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 70185848 kB' 'MemAvailable: 75212864 kB' 'Buffers: 10272 kB' 'Cached: 15075884 kB' 'SwapCached: 0 kB' 'Active: 11838408 kB' 'Inactive: 4123708 kB' 'Active(anon): 10636324 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 878876 kB' 'Mapped: 159024 kB' 'Shmem: 9760364 kB' 'KReclaimable: 537144 kB' 'Slab: 1162080 kB' 'SReclaimable: 537144 kB' 'SUnreclaim: 624936 kB' 'KernelStack: 17552 kB' 'PageTables: 9072 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52958452 kB' 'Committed_AS: 12006684 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215520 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 
0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB'
[repetitive xtrace elided: setup/common.sh@31-32 reads the snapshot line by line (IFS=': '; read -r var val _) and issues continue for every key from MemTotal through HugePages_Free that is not HugePages_Surp]
00:05:22.486 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:22.486 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:22.486 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:22.486 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # surp=0
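For readers following the trace: the setup/common.sh@17-33 commands above reduce to the loop below. This is a minimal sketch reconstructed from the xtrace in this log, not the verbatim SPDK helper; the name get_meminfo_sketch and the exact layout are ours.

#!/usr/bin/env bash
shopt -s extglob   # needed for the +([0-9]) pattern, as in the traced script

# Print the value of one meminfo key, from /proc/meminfo or, when a node
# number is given, from that node's meminfo file (mirrors the trace above).
get_meminfo_sketch() {
	local get=$1 node=${2:-}
	local var val line
	local mem_f=/proc/meminfo mem

	# With a node argument, read the per-NUMA-node file instead
	if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi

	mapfile -t mem < "$mem_f"
	# Per-node lines look like "Node 0 MemTotal: ..."; drop that prefix
	mem=("${mem[@]#Node +([0-9]) }")

	for line in "${mem[@]}"; do
		IFS=': ' read -r var val _ <<< "$line"
		# This is the comparison the elided trace repeats once per key
		if [[ $var == "$get" ]]; then
			echo "${val:-0}"
			return 0
		fi
	done
	echo 0
}

Called as get_meminfo_sketch HugePages_Surp it prints 0 for this run; with a node argument (as in the get_meminfo HugePages_Surp 0 call near the end of this section) it reads /sys/devices/system/node/node0/meminfo the same way.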
00:05:22.486 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:22.486 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:22.486 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:22.486 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:22.486 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:22.486 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:22.486 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:22.486 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:22.486 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:22.486 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:22.486 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:22.486 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:22.486 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 70185764 kB' 'MemAvailable: 75212780 kB' 'Buffers: 10272 kB' 'Cached: 15075888 kB' 'SwapCached: 0 kB' 'Active: 11838320 kB' 'Inactive: 4123708 kB' 'Active(anon): 10636236 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 878856 kB' 'Mapped: 159024 kB' 'Shmem: 9760368 kB' 'KReclaimable: 537144 kB' 'Slab: 1162080 kB' 'SReclaimable: 537144 kB' 'SUnreclaim: 624936 kB' 'KernelStack: 17568 kB' 'PageTables: 9132 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52958452 kB' 'Committed_AS: 12006556 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215536 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB'
[repetitive xtrace elided: the same per-key scan, this time against HugePages_Rsvd; continue issued for every key from MemTotal through HugePages_Free]
00:05:22.488 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:22.488 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:22.488 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:22.488 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:22.488 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1536
nr_hugepages=1536
00:05:22.488 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
resv_hugepages=0
00:05:22.488 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:05:22.488 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
anon_hugepages=0
00:05:22.488 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@106 -- # (( 1536 == nr_hugepages + surp + resv ))
00:05:22.488 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@108 -- # (( 1536 == nr_hugepages ))
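Taken together, the hugepages.sh@98-108 steps just traced amount to the following bookkeeping. This is a hedged sketch using this run's values and the get_meminfo_sketch helper defined above (our naming, not SPDK's):

# Accounting sketch for this run; nr_hugepages comes from the test setup.
nr_hugepages=1536
surp=$(get_meminfo_sketch HugePages_Surp)   # 0 in this log
resv=$(get_meminfo_sketch HugePages_Rsvd)   # 0 in this log

echo "nr_hugepages=$nr_hugepages"
echo "resv_hugepages=$resv"
echo "surplus_hugepages=$surp"

# The kernel's HugePages_Total must cover requested + surplus + reserved
# pages; here 1536 == 1536 + 0 + 0, so the custom allocation is consistent.
total=$(get_meminfo_sketch HugePages_Total)
(( total == nr_hugepages + surp + resv )) || echo "hugepage accounting mismatch" >&2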
15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 70186020 kB' 'MemAvailable: 75213036 kB' 'Buffers: 10272 kB' 'Cached: 15075888 kB' 'SwapCached: 0 kB' 'Active: 11838352 kB' 'Inactive: 4123708 kB' 'Active(anon): 10636268 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 878840 kB' 'Mapped: 159024 kB' 'Shmem: 9760368 kB' 'KReclaimable: 537144 kB' 'Slab: 1162080 kB' 'SReclaimable: 537144 kB' 'SUnreclaim: 624936 kB' 'KernelStack: 17504 kB' 'PageTables: 8900 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52958452 kB' 'Committed_AS: 12006580 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215520 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB' 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.489 
15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.489 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:22.490 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
[xtrace repeats the IFS=': ' / read -r var val _ / compare-and-continue triplet for every remaining non-matching field: NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted]
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages + surp + resv ))
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@26 -- # local node
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:22.491 15:07:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:22.491 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:22.491 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
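The get_meminfo helper traced above reads either /proc/meminfo or a per-node /sys/devices/system/node/nodeN/meminfo file, strips the "Node N " prefix that the per-node files carry, and splits each "Key: value" line on ': '. A minimal standalone sketch of the same parsing approach (the get_node_meminfo name is illustrative, not SPDK's exact helper):

    #!/usr/bin/env bash
    # Sketch of the per-node meminfo parsing pattern seen in setup/common.sh.
    shopt -s extglob # needed for the +([0-9]) pattern below

    get_node_meminfo() { # usage: get_node_meminfo <key> [node]
        local key=$1 node=${2:-} var val _ line
        local mem_f=/proc/meminfo
        local -a mem
        # Prefer the node-local file when a node is given and the file exists.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem <"$mem_f"
        mem=("${mem[@]#Node +([0-9]) }") # drop the "Node N " prefix, if present
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<<"$line" # "MemTotal: 48064864 kB" -> MemTotal / 48064864
            [[ $var == "$key" ]] && echo "$val" && return 0
        done
        return 1
    }

    get_node_meminfo HugePages_Total 0 # prints 512 on the host in this log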
00:05:22.491 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48064864 kB' 'MemFree: 39958732 kB' 'MemUsed: 8106132 kB' 'SwapCached: 0 kB' 'Active: 5353296 kB' 'Inactive: 251764 kB' 'Active(anon): 4435824 kB' 'Inactive(anon): 0 kB' 'Active(file): 917472 kB' 'Inactive(file): 251764 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4850044 kB' 'Mapped: 98344 kB' 'AnonPages: 757804 kB' 'Shmem: 3680808 kB' 'KernelStack: 10600 kB' 'PageTables: 6216 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 198064 kB' 'Slab: 563052 kB' 'SReclaimable: 198064 kB' 'SUnreclaim: 364988 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace repeats the IFS/read + compare-and-continue triplet for every node0 field from MemTotal through HugePages_Free before the requested key matches]
00:05:22.493 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:22.493 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:22.493 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:22.493 15:07:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:22.493 15:07:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:22.493 15:07:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:22.493 15:07:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
00:05:22.493 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:22.493 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1
00:05:22.493 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:22.493 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:22.493 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:22.493 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:05:22.493 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:22.493 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:22.493 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:22.493 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:22.493 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:22.493 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44220560 kB' 'MemFree: 30227064 kB' 'MemUsed: 13993496 kB' 'SwapCached: 0 kB' 'Active: 6485136 kB' 'Inactive: 3871944 kB' 'Active(anon): 6200524 kB' 'Inactive(anon): 0 kB' 'Active(file): 284612 kB' 'Inactive(file): 3871944 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10236192 kB' 'Mapped: 60680 kB' 'AnonPages: 121052 kB' 'Shmem: 6079636 kB' 'KernelStack: 6952 kB' 'PageTables: 2848 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 339080 kB' 'Slab: 599036 kB' 'SReclaimable: 339080 kB' 'SUnreclaim: 259956 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[xtrace repeats the same field-by-field scan over node1's meminfo, MemTotal through HugePages_Free]
00:05:22.495 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:22.495 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:22.495 15:07:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:22.495 15:07:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:22.495 15:07:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:22.495 15:07:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:22.495 15:07:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:22.495 15:07:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512'
00:05:22.495 node0=512 expecting 512
00:05:22.495 15:07:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:22.495 15:07:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:22.495 15:07:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:22.495 15:07:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node1=1024 expecting 1024'
00:05:22.495 node1=1024 expecting 1024
00:05:22.495 15:07:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@129 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:05:22.495
00:05:22.495 real 0m6.163s
00:05:22.495 user 0m2.112s
00:05:22.495 sys 0m4.080s
00:05:22.495 15:07:01 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:22.495 15:07:01 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:22.495 ************************************
00:05:22.495 END TEST custom_alloc
00:05:22.495 ************************************
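The custom_alloc pass just completed asserts that the 1536 reserved pages were split 512/1024 across the two NUMA nodes. A minimal sketch of the same check done directly against the kernel's per-node sysfs counters (assumes 2048 kB hugepages and nodes 0 and 1, as on this host):

    #!/usr/bin/env bash
    # Verify a custom per-node hugepage split; expected values mirror this run.
    declare -A expected=([0]=512 [1]=1024)
    rc=0
    for node in "${!expected[@]}"; do
        f=/sys/devices/system/node/node$node/hugepages/hugepages-2048kB/nr_hugepages
        actual=$(<"$f") # kernel's own per-node reservation counter
        echo "node$node=$actual expecting ${expected[$node]}"
        [[ $actual -eq ${expected[$node]} ]] || rc=1
    done
    exit $rc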
00:05:22.495 15:07:01 setup.sh.hugepages -- setup/hugepages.sh@204 -- # run_test no_shrink_alloc no_shrink_alloc
00:05:22.495 15:07:01 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:22.495 15:07:01 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:22.495 15:07:01 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:22.495 ************************************
00:05:22.495 START TEST no_shrink_alloc
00:05:22.495 ************************************
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1129 -- # no_shrink_alloc
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@185 -- # get_test_nr_hugepages 2097152 0
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # (( 2 > 1 ))
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # shift
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # node_ids=('0')
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # local node_ids
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # user_nodes=('0')
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@68 -- # (( 1 > 0 ))
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}"
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@72 -- # return 0
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # NRHUGE=1024
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # HUGENODE=0
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # setup output
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:22.495 15:07:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
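With NRHUGE=1024 and HUGENODE=0, the scripts/setup.sh invocation above is asked to reserve all 1024 pages on node 0 before (re)binding devices. setup.sh's internals are not shown in this log, but at the kernel interface the reservation amounts to a write to the node-local sysfs counter; a sketch of that generic route (run as root):

    #!/usr/bin/env bash
    # What NRHUGE=1024 HUGENODE=0 boils down to at the kernel interface:
    # reserve 1024 x 2 MiB hugepages on NUMA node 0 only.
    NRHUGE=1024 HUGENODE=0
    sysfs=/sys/devices/system/node/node$HUGENODE/hugepages/hugepages-2048kB/nr_hugepages
    echo "$NRHUGE" > "$sysfs"
    echo "node$HUGENODE now has $(<"$sysfs") hugepages reserved"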
00:05:26.695 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:26.695 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:26.695 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:26.695 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:26.695 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:26.695 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:26.695 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:26.695 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:26.695 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:26.695 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:26.696 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:26.696 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:26.696 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:26.696 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:26.696 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:26.696 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:26.696 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@189 -- # verify_nr_hugepages
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
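verify_nr_hugepages first gates on transparent hugepages: the kernel brackets the active THP mode, so 'always [madvise] never' passes the *\[\n\e\v\e\r\]* test above (THP not disabled) and AnonHugePages is then sampled. A standalone sketch of that gate:

    #!/usr/bin/env bash
    # The THP gate, standalone: /sys/kernel/mm/transparent_hugepage/enabled
    # brackets the active mode, e.g. "always [madvise] never". Anonymous THP
    # (AnonHugePages in /proc/meminfo) can only be nonzero if the mode is not [never].
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
    if [[ $thp != *"[never]"* ]]; then
        awk '$1 == "AnonHugePages:" {print}' /proc/meminfo
    else
        echo "THP disabled; AnonHugePages stays 0"
    fi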
12010008 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215584 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB' 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.610 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31-32 -- # scan: Zswap Zswapped Dirty Writeback AnonPages Mapped Shmem KReclaimable Slab SReclaimable SUnreclaim KernelStack PageTables SecPageTables NFS_Unstable Bounce WritebackTmp CommitLimit Committed_AS VmallocTotal VmallocUsed VmallocChunk Percpu HardwareCorrupted -- no match for AnonHugePages, continue
00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0
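The lines above are bash xtrace output from get_meminfo() in setup/common.sh: the helper snapshots /proc/meminfo (or a per-node meminfo file), strips any "Node N " prefix, then walks the fields with IFS=': ' until the requested one matches and echoes its value. A minimal runnable sketch of that loop follows; it is an approximation reconstructed from the trace, not the verbatim SPDK source:

    #!/usr/bin/env bash
    shopt -s extglob                        # for the "Node N " prefix strip below
    get_meminfo() {                         # usage: get_meminfo <field> [<numa node>]
        local get=$1 node=${2:-} mem_f=/proc/meminfo
        local line var val _
        local -a mem
        # Per-node queries read the node-specific meminfo instead
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")    # node files prefix each line with "Node N "
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue   # not the requested field: keep scanning
            echo "$val"
            return 0
        done
        return 1
    }
    get_meminfo AnonHugePages   # prints 0 on this runner, hence anon=0 in the trace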
00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:28.611 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:28.612 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:28.612 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:28.612 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:28.612 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:28.612 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:28.612 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:28.612 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 71216308 kB' 'MemAvailable: 76243324 kB' 'Buffers: 10272 kB' 'Cached: 15076096 kB' 'SwapCached: 0 kB' 'Active: 11840248 kB' 'Inactive: 4123708 kB' 'Active(anon): 10638164 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 880992 kB' 'Mapped: 159112 kB' 'Shmem: 9760576 kB' 'KReclaimable: 537144 kB' 'Slab: 1162920 kB' 'SReclaimable: 537144 kB' 'SUnreclaim: 625776 kB' 'KernelStack: 17840 kB' 'PageTables: 9804 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482740 kB' 'Committed_AS: 12010028 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215648 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB'
00:05:28.612 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # scan: MemTotal through HugePages_Rsvd -- no match for HugePages_Surp, continue
00:05:28.613 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:28.613 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:28.613 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:28.613 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0
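Note that every get_meminfo call re-reads and re-scans the whole file, which is why the identical field-by-field walk repeats in the trace for each query (HugePages_Surp above, HugePages_Rsvd next). Outside the harness, a one-off lookup can be done in a single pass; a sketch, not part of the SPDK scripts:

    # Print one /proc/meminfo field's value and stop at the first match:
    awk -F':[ \t]+' '$1 == "HugePages_Surp" { print $2; exit }' /proc/meminfo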
00:05:28.613 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:28.613 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:28.613 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:28.613 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:28.613 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:28.613 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:28.613 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:28.613 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:28.613 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:28.613 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:28.613 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:28.613 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:28.614 15:07:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 71221044 kB' 'MemAvailable: 76248060 kB' 'Buffers: 10272 kB' 'Cached: 15076112 kB' 'SwapCached: 0 kB' 'Active: 11840736 kB' 'Inactive: 4123708 kB' 'Active(anon): 10638652 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 881412 kB' 'Mapped: 159096 kB' 'Shmem: 9760592 kB' 'KReclaimable: 537144 kB' 'Slab: 1162824 kB' 'SReclaimable: 537144 kB' 'SUnreclaim: 625680 kB' 'KernelStack: 17984 kB' 'PageTables: 10152 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482740 kB' 'Committed_AS: 12010048 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215632 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB'
00:05:28.614 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # scan: MemTotal through HugePages_Free -- no match for HugePages_Rsvd, continue
00:05:28.615 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:28.615 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:28.615 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:28.615 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:28.615 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:05:28.615 nr_hugepages=1024
00:05:28.616 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:28.616 resv_hugepages=0
00:05:28.616 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:28.616 surplus_hugepages=0
00:05:28.616 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:28.616 anon_hugepages=0
00:05:28.616 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:28.616 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
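With anon=0, surp=0, and resv=0 collected, hugepages.sh@106-108 asserts that the hugepage pool is fully accounted for: the 1024 pages under test must equal nr_hugepages plus surplus plus reserved, and the pool itself must still be exactly 1024. Spelled out as a standalone sketch (variable names follow the trace):

    nr_hugepages=1024 surp=0 resv=0 anon=0
    (( 1024 == nr_hugepages + surp + resv ))   # 1024 == 1024 + 0 + 0: holds
    (( 1024 == nr_hugepages ))                 # pool neither shrank nor grew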
00:05:28.616 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:28.616 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:28.616 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:28.616 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:28.616 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:28.616 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:28.616 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:28.616 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:28.616 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:28.616 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:28.616 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 71220860 kB' 'MemAvailable: 76247876 kB' 'Buffers: 10272 kB' 'Cached: 15076136 kB' 'SwapCached: 0 kB' 'Active: 11840644 kB' 'Inactive: 4123708 kB' 'Active(anon): 10638560 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 881228 kB' 'Mapped: 159096 kB' 'Shmem: 9760616 kB' 'KReclaimable: 537144 kB' 'Slab: 1162696 kB' 'SReclaimable: 537144 kB' 'SUnreclaim: 625552 kB' 'KernelStack: 17904 kB' 'PageTables: 9884 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482740 kB' 'Committed_AS: 12008568 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215616 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB'
00:05:28.616 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:28.616 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:28.616 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # scan: MemTotal through KernelStack -- no match for HugePages_Total, continue 00:05:28.617 15:07:07
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- 
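The scan above is the whole of get_meminfo: setup/common.sh snapshots the meminfo file into an array, then walks it field by field with IFS=': ' until the requested key matches and its value is echoed. A minimal sketch of that pattern, reconstructed from the xtrace output (not the verbatim SPDK source):

    #!/usr/bin/env bash
    # Sketch of the get_meminfo pattern traced above; reconstructed from the
    # setup/common.sh xtrace output, not copied from the SPDK repository.
    shopt -s extglob

    get_meminfo() {
        local get=$1 node=$2
        local var val
        local mem_f=/proc/meminfo mem line
        # A per-node query reads that node's own meminfo file instead.
        if [[ -e /sys/devices/system/node/node$node/meminfo ]] && [[ -n $node ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node N "; strip it so the
        # remaining "Key: value" layout matches /proc/meminfo.
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done
        return 1
    }

    get_meminfo HugePages_Total    # 1024 in the run above
    get_meminfo HugePages_Surp 0   # surplus hugepages on node 0

The per-field [[ ... == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] lines in the log are this loop's match test; the backslashes are just how xtrace prints an unquoted pattern.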
00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node
00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0
00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:28.617 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0
00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:28.618 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48064864 kB' 'MemFree: 38908052 kB' 'MemUsed: 9156812 kB' 'SwapCached: 0 kB' 'Active: 5354872 kB' 'Inactive: 251764 kB' 'Active(anon): 4437400 kB' 'Inactive(anon): 0 kB' 'Active(file): 917472 kB' 'Inactive(file): 251764 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4850084 kB' 'Mapped: 98348 kB' 'AnonPages: 759680 kB' 'Shmem: 3680848 kB' 'KernelStack: 10888 kB' 'PageTables: 7164 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 198064 kB' 'Slab: 563028 kB' 'SReclaimable: 198064 kB' 'SUnreclaim: 364964 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[repetitive trace elided: setup/common.sh@31-32 scan each node0 field from MemTotal through HugePages_Free and hit continue on the failed HugePages_Surp match]
00:05:28.619 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:28.619 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:28.619 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:28.619 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
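get_nodes, traced above, discovers the host's NUMA layout by globbing /sys/devices/system/node/node<N> and records the hugepage count per node; in this run node0 holds all 1024 pages and node1 holds none (no_nodes=2). A sketch of that enumeration, with the array and variable names taken from the trace and the loop body reconstructed (it reuses the get_meminfo sketch above):

    #!/usr/bin/env bash
    shopt -s extglob

    # Enumerate NUMA nodes and record each node's hugepage total.
    nodes_sys=()
    for node in /sys/devices/system/node/node+([0-9]); do
        # ${node##*node} reduces ".../node0" to the bare index.
        nodes_sys[${node##*node}]=$(get_meminfo HugePages_Total "${node##*node}")
    done
    no_nodes=${#nodes_sys[@]}
    (( no_nodes > 0 )) || exit 1   # bail out if no NUMA sysfs entries exist
    echo "nodes=$no_nodes"         # 2 in this run; nodes_sys=(1024 0)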
00:05:28.619 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:28.619 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:28.619 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:28.619 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024'
node0=1024 expecting 1024
00:05:28.619 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]]
00:05:28.619 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # CLEAR_HUGE=no
00:05:28.619 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # NRHUGE=512
00:05:28.619 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # HUGENODE=0
00:05:28.619 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # setup output
00:05:28.619 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:28.619 15:07:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:32.818 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:32.818 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:32.818 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:32.818 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:32.818 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:32.818 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:32.818 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:32.818 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:32.818 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:32.818 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:32.818 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:32.818 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:32.818 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:32.818 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:32.818 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:32.818 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:32.818 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:34.730 INFO: Requested 512 hugepages but 1024 already allocated on node0
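CLEAR_HUGE, NRHUGE and HUGENODE are the environment knobs the test passes to SPDK's scripts/setup.sh; with CLEAR_HUGE=no an existing, larger reservation is kept rather than shrunk, which is exactly what the INFO line reports and what this no_shrink_alloc case is checking. The invocation boils down to:

    # Values taken from the trace above; run from the SPDK checkout.
    CLEAR_HUGE=no NRHUGE=512 HUGENODE=0 ./scripts/setup.sh
    # 1024 pages are already reserved on node0, so the request for 512
    # does not shrink the pool and setup.sh reports:
    #   INFO: Requested 512 hugepages but 1024 already allocated on node0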
00:05:34.730 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@194 -- # verify_nr_hugepages
00:05:34.730 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node
00:05:34.730 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:34.730 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:34.730 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:34.730 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:34.730 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:34.730 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:34.730 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:34.730 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:34.730 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:34.730 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:34.730 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:34.730 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:34.730 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:34.730 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:34.730 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:34.730 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:34.730 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 71243404 kB' 'MemAvailable: 76270380 kB' 'Buffers: 10272 kB' 'Cached: 15076272 kB' 'SwapCached: 0 kB' 'Active: 11838872 kB' 'Inactive: 4123708 kB' 'Active(anon): 10636788 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 879364 kB' 'Mapped: 159232 kB' 'Shmem: 9760752 kB' 'KReclaimable: 537104 kB' 'Slab: 1162908 kB' 'SReclaimable: 537104 kB' 'SUnreclaim: 625804 kB' 'KernelStack: 17568 kB' 'PageTables: 9060 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482740 kB' 'Committed_AS: 12008188 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215552 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB'
[repetitive trace elided: setup/common.sh@31-32 scan each field from MemTotal through HardwareCorrupted and hit continue on the failed AnonHugePages match]
00:05:34.731 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:34.731 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:34.731 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:34.731 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0
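The @95 test a few lines up compares the string 'always [madvise] never' against the pattern *\[\n\e\v\e\r\]*; that string is the content of /sys/kernel/mm/transparent_hugepage/enabled, so AnonHugePages is only consulted when THP is not pinned to [never]. A sketch of that probe, reconstructed from the trace (again reusing the get_meminfo sketch above):

    # Only count THP-backed anonymous memory when THP can actually be used.
    anon=0
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)  # "always [madvise] never" here
    if [[ $thp != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)  # kB; 0 in this run
    fi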
00:05:34.731 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:34.731 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:34.731 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:34.731 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:34.731 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:34.731 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:34.731 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:34.731 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:34.731 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:34.731 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:34.731 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:34.731 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:34.731 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285424 kB' 'MemFree: 71241176 kB' 'MemAvailable: 76268144 kB' 'Buffers: 10272 kB' 'Cached: 15076276 kB' 'SwapCached: 0 kB' 'Active: 11839072 kB' 'Inactive: 4123708 kB' 'Active(anon): 10636988 kB' 'Inactive(anon): 0 kB' 'Active(file): 1202084 kB' 'Inactive(file): 4123708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 879572 kB' 'Mapped: 159096 kB' 'Shmem: 9760756 kB' 'KReclaimable: 537096 kB' 'Slab: 1162860 kB' 'SReclaimable: 537096 kB' 'SUnreclaim: 625764 kB' 'KernelStack: 17584 kB' 'PageTables: 9108 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482740 kB' 'Committed_AS: 12008208 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 215536 kB' 'VmallocChunk: 0 kB' 'Percpu: 80352 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 810424 kB' 'DirectMap2M: 20885504 kB' 'DirectMap1G: 80740352 kB'
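
The snapshot itself already shows a healthy hugepage pool: 1024 pages of 2048 kB each, all free, none reserved or surplus. The Hugetlb figure is simply the product of the page count and the page size, which is easy to confirm (values copied from the snapshot above):

# Values from the meminfo snapshot above.
hugepages_total=1024
hugepagesize_kb=2048
echo $((hugepages_total * hugepagesize_kb))   # 2097152, matching 'Hugetlb: 2097152 kB'
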
00:05:34.732 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [xtrace condensed: every key from MemTotal through HugePages_Rsvd fails the HugePages_Surp comparison and hits continue]
00:05:34.732 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:34.732 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:34.732 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:34.732 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0
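
HugePages_Surp counts surplus pages allocated beyond the configured pool via overcommit, and HugePages_Rsvd (fetched next) counts pages reserved for mappings but not yet faulted in; both have to be 0 for the accounting below to come out clean. For a quick manual check outside the harness, all four counters can be pulled in one pass instead of one get_meminfo call each (illustrative one-liner, not part of the test):

grep -E '^HugePages_(Total|Free|Rsvd|Surp):' /proc/meminfo
# HugePages_Total:    1024
# HugePages_Free:     1024
# HugePages_Rsvd:        0
# HugePages_Surp:        0
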
00:05:34.732 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:34.732 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:34.732 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # [locals, mem_f=/proc/meminfo, mapfile -t mem and the Node-prefix strip repeat exactly as in the HugePages_Surp call above]
00:05:34.732 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' [meminfo snapshot identical to the one above except: 'MemFree: 71241680 kB' 'MemAvailable: 76268648 kB' 'Active: 11839108 kB' 'Active(anon): 10637024 kB' 'AnonPages: 879604 kB' 'KernelStack: 17600 kB' 'PageTables: 9160 kB' 'Committed_AS: 12008228 kB']
00:05:34.732 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [xtrace condensed: every key from MemTotal through HugePages_Free fails the HugePages_Rsvd comparison and hits continue]
00:05:34.733 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:34.733 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:34.733 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:34.733 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:34.733 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:05:34.733 nr_hugepages=1024
00:05:34.733 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:34.733 resv_hugepages=0
00:05:34.733 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:34.733 surplus_hugepages=0
00:05:34.733 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:34.733 anon_hugepages=0
00:05:34.733 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:34.733 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
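
hugepages.sh@106-108 is the actual assertion of this test step: the requested pool size (1024 pages here) must equal HugePages_Total plus surplus plus reserved, and, since surp and resv are both 0, HugePages_Total alone. A standalone restatement of that check, assuming the get_meminfo sketch shown earlier (the 'expected' variable is illustrative):

# Accounting check in the spirit of hugepages.sh@106-108.
expected=1024
nr_hugepages=$(get_meminfo HugePages_Total)
surp=$(get_meminfo HugePages_Surp)
resv=$(get_meminfo HugePages_Rsvd)

# Surplus or reserved pages would mean the pool is not in the clean,
# fully-allocated state the allocator tests expect.
(( expected == nr_hugepages + surp + resv )) || echo 'surplus/reserved pages present' >&2
(( expected == nr_hugepages )) || echo 'hugepage pool size mismatch' >&2
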
00:05:34.733 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:34.733 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:34.733 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # [locals, mem_f=/proc/meminfo, mapfile -t mem and the Node-prefix strip repeat exactly as in the calls above]
00:05:34.733 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' [meminfo snapshot identical to the first one above except: 'MemFree: 71242048 kB' 'MemAvailable: 76269016 kB' 'Cached: 15076316 kB' 'Active: 11839168 kB' 'Active(anon): 10637084 kB' 'AnonPages: 879600 kB' 'Mapped: 159600 kB' 'Shmem: 9760796 kB' 'KernelStack: 17568 kB' 'PageTables: 9056 kB' 'Committed_AS: 12009608 kB']
00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [xtrace condensed: every key from MemTotal through AnonHugePages fails the HugePages_Total comparison and hits continue]
00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- #
IFS=': ' 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:34.734 
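
The long run of continue entries above is setup/common.sh scanning /proc/meminfo one "Key: value kB" line at a time until the requested key matches. A minimal standalone sketch of that pattern (function name illustrative, not the verbatim SPDK helper):

  #!/usr/bin/env bash
  # Print the value of one /proc/meminfo key, the way the traced loop
  # does it: IFS=': ' splits "HugePages_Total:    1024" into key/value.
  get_meminfo_value() {
      local get=$1 var val _
      while IFS=': ' read -r var val _; do
          [[ $var == "$get" ]] || continue   # the "continue" lines in the log
          echo "$val"
          return 0
      done < /proc/meminfo
      return 1
  }
  get_meminfo_value HugePages_Total   # prints 1024 on this runner
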
15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48064864 kB' 'MemFree: 38910172 kB' 'MemUsed: 9154692 kB' 'SwapCached: 0 kB' 'Active: 5361024 kB' 'Inactive: 251764 kB' 'Active(anon): 4443552 kB' 'Inactive(anon): 0 kB' 'Active(file): 917472 kB' 'Inactive(file): 251764 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4850152 kB' 'Mapped: 98496 kB' 'AnonPages: 765952 kB' 'Shmem: 3680916 kB' 'KernelStack: 10680 kB' 'PageTables: 6436 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 198056 kB' 'Slab: 563220 kB' 'SReclaimable: 198056 kB' 'SUnreclaim: 365164 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:05:34.734 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # [scan condensed: the node0 meminfo keys -- MemFree, MemUsed, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, Dirty, Writeback, FilePages, Mapped, AnonPages, Shmem, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, KReclaimable, Slab, SReclaimable, SUnreclaim, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, Unaccepted -- each failed the HugePages_Surp test and hit continue, with IFS=': ' / read -r var val _ advancing the loop] 00:05:34.735 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:34.735 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:34.735 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:34.735 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:34.735 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:34.735 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:34.735 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:34.735 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:34.735 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:34.735 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:34.735 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:34.735 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:34.735 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:34.735 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:34.735 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:34.735 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:05:34.735 node0=1024 expecting 1024 00:05:34.735 15:07:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:05:34.735 00:05:34.735 real 0m12.163s 00:05:34.735 user 0m4.259s 00:05:34.735 sys 0m7.897s 00:05:34.735 15:07:13 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:34.735 15:07:13 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:34.735 ************************************ 00:05:34.735 END TEST no_shrink_alloc 00:05:34.735 ************************************ 00:05:34.735 15:07:13 setup.sh.hugepages -- setup/hugepages.sh@206 -- # clear_hp 00:05:34.735 15:07:13 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:05:34.735 15:07:13 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:34.735 15:07:13 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:34.735 15:07:13 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:34.735 15:07:13 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:34.735 15:07:13 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:34.735 15:07:13 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:34.735 15:07:13 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:34.735 15:07:13 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:34.735 15:07:13 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:34.735 15:07:13 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:34.735 15:07:13 setup.sh.hugepages -- 
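
The same scan just ran against node0: per-node meminfo lives under /sys/devices/system/node/ and prefixes every line with "Node <N> ", which the helper strips before matching, and clear_hp finishes the suite by writing 0 back to every node's hugepage pools. A rough sketch of both patterns (helper name illustrative; the prefix strip mirrors the mem=("${mem[@]#Node +([0-9]) }") line traced above):

  shopt -s extglob nullglob
  # Per-node lookup: drop the "Node 0 " prefix, then scan as before.
  get_node_meminfo_value() {
      local node=$1 get=$2 var val _ line
      local -a mem
      mapfile -t mem < "/sys/devices/system/node/node${node}/meminfo"
      mem=("${mem[@]#Node +([0-9]) }")
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          [[ $var == "$get" ]] && { echo "$val"; return 0; }
      done
      return 1
  }
  get_node_meminfo_value 0 HugePages_Surp   # prints 0 on this runner

  # Cleanup, as in clear_hp: zero every per-node hugepage pool (needs root).
  for node in /sys/devices/system/node/node+([0-9]); do
      for hp in "$node"/hugepages/hugepages-*; do
          echo 0 > "$hp/nr_hugepages"
      done
  done
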
setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:05:34.735 15:07:13 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:05:34.735 00:05:34.735 real 0m40.639s 00:05:34.735 user 0m13.027s 00:05:34.735 sys 0m24.491s 00:05:34.735 15:07:13 setup.sh.hugepages -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:34.735 15:07:13 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:34.735 ************************************ 00:05:34.735 END TEST hugepages 00:05:34.735 ************************************ 00:05:34.735 15:07:13 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:34.735 15:07:13 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:34.735 15:07:13 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:34.735 15:07:13 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:34.995 ************************************ 00:05:34.995 START TEST driver 00:05:34.995 ************************************ 00:05:34.995 15:07:13 setup.sh.driver -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:34.995 * Looking for test storage... 00:05:34.996 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:34.996 15:07:13 setup.sh.driver -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:05:34.996 15:07:13 setup.sh.driver -- common/autotest_common.sh@1693 -- # lcov --version 00:05:34.996 15:07:13 setup.sh.driver -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:05:34.996 15:07:13 setup.sh.driver -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@336 -- # IFS=.-: 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@336 -- # read -ra ver1 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@337 -- # IFS=.-: 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@337 -- # read -ra ver2 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@338 -- # local 'op=<' 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@340 -- # ver1_l=2 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@341 -- # ver2_l=1 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@344 -- # case "$op" in 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@345 -- # : 1 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@365 -- # decimal 1 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@353 -- # local d=1 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@355 -- # echo 1 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@365 -- # ver1[v]=1 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@366 -- # decimal 2 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@353 -- # local d=2 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@355 -- # echo 2 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@366 -- # ver2[v]=2 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:34.996 15:07:13 setup.sh.driver -- scripts/common.sh@368 -- # return 0 00:05:34.996 15:07:13 setup.sh.driver -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:34.996 15:07:13 setup.sh.driver -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:05:34.996 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:34.996 --rc genhtml_branch_coverage=1 00:05:34.996 --rc genhtml_function_coverage=1 00:05:34.996 --rc genhtml_legend=1 00:05:34.996 --rc geninfo_all_blocks=1 00:05:34.996 --rc geninfo_unexecuted_blocks=1 00:05:34.996 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:34.996 ' 00:05:34.996 15:07:13 setup.sh.driver -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:05:34.996 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:34.996 --rc genhtml_branch_coverage=1 00:05:34.996 --rc genhtml_function_coverage=1 00:05:34.996 --rc genhtml_legend=1 00:05:34.996 --rc geninfo_all_blocks=1 00:05:34.996 --rc geninfo_unexecuted_blocks=1 00:05:34.996 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:34.996 ' 00:05:34.996 15:07:13 setup.sh.driver -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:05:34.996 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:34.996 --rc genhtml_branch_coverage=1 00:05:34.996 --rc genhtml_function_coverage=1 00:05:34.996 --rc genhtml_legend=1 00:05:34.996 --rc geninfo_all_blocks=1 00:05:34.996 --rc geninfo_unexecuted_blocks=1 00:05:34.996 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:34.996 ' 00:05:34.996 15:07:13 setup.sh.driver -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:05:34.996 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:34.996 --rc genhtml_branch_coverage=1 00:05:34.996 --rc genhtml_function_coverage=1 00:05:34.996 --rc genhtml_legend=1 00:05:34.996 --rc geninfo_all_blocks=1 00:05:34.996 --rc geninfo_unexecuted_blocks=1 00:05:34.996 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:34.996 ' 00:05:34.996 15:07:13 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:05:34.996 15:07:13 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:34.996 15:07:13 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:43.118 15:07:20 setup.sh.driver -- 
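
Before each sub-test, scripts/common.sh decides whether the installed lcov (1.15 here) predates version 2 by splitting both strings on ".", "-" and ":" and comparing component by component. A condensed sketch of that comparison (function name illustrative; the real helper also validates non-numeric parts via decimal()):

  # Return success when dotted version $1 is strictly older than $2.
  ver_lt() {
      local -a v1 v2
      local i n
      IFS=.-: read -ra v1 <<< "$1"
      IFS=.-: read -ra v2 <<< "$2"
      n=$(( ${#v1[@]} > ${#v2[@]} ? ${#v1[@]} : ${#v2[@]} ))
      for (( i = 0; i < n; i++ )); do
          (( ${v1[i]:-0} < ${v2[i]:-0} )) && return 0
          (( ${v1[i]:-0} > ${v2[i]:-0} )) && return 1
      done
      return 1   # equal is not less-than
  }
  ver_lt 1.15 2 && echo "old lcov: enable the llvm-gcov wrapper"
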
setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:43.118 15:07:20 setup.sh.driver -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:43.118 15:07:20 setup.sh.driver -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:43.118 15:07:20 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:43.118 ************************************ 00:05:43.118 START TEST guess_driver 00:05:43.118 ************************************ 00:05:43.118 15:07:20 setup.sh.driver.guess_driver -- common/autotest_common.sh@1129 -- # guess_driver 00:05:43.118 15:07:20 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:43.118 15:07:20 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:05:43.118 15:07:20 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:05:43.118 15:07:20 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:05:43.118 15:07:20 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:05:43.118 15:07:20 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:43.118 15:07:20 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:43.118 15:07:20 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:43.118 15:07:20 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:43.118 15:07:21 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 238 > 0 )) 00:05:43.118 15:07:21 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:43.118 15:07:21 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:05:43.118 15:07:21 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:05:43.118 15:07:21 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:43.118 15:07:21 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:43.118 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:43.118 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:43.118 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:43.118 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:43.118 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:43.118 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:43.118 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:43.118 15:07:21 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:05:43.118 15:07:21 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:05:43.118 15:07:21 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:43.118 15:07:21 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:43.118 15:07:21 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:43.118 Looking for driver=vfio-pci 00:05:43.118 15:07:21 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:43.118 15:07:21 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- 
# setup output config 15:07:21 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 15:07:21 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:46.410 15:07:24 setup.sh.driver.guess_driver -- setup/driver.sh@57-61 -- # [readout condensed: the config output was consumed marker by marker -- read -r _ _ _ _ marker setup_driver, [[ -> == \-\> ]], [[ vfio-pci == vfio-pci ]] -- repeating for every device line at 00:05:46.410-411 (15:07:24), 00:05:49.704 (15:07:28) and 00:05:51.616 (15:07:30); every entry reported vfio-pci] 00:05:51.616 15:07:30 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 15:07:30 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 15:07:30 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 15:07:30 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:59.738 00:05:59.738 real 0m16.045s user 0m4.037s sys 0m8.116s 15:07:37 setup.sh.driver.guess_driver -- common/autotest_common.sh@1130 -- # xtrace_disable 15:07:37 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:59.738 ************************************ 00:05:59.738 END TEST guess_driver 00:05:59.738 ************************************ 00:05:59.738 00:05:59.738 real 0m23.651s user 0m6.302s sys 0m12.750s 15:07:37 setup.sh.driver -- common/autotest_common.sh@1130 -- # xtrace_disable 15:07:37 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:59.738 ************************************ 00:05:59.738 END TEST driver 00:05:59.738 ************************************ 00:05:59.738 15:07:37 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh
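
guess_driver passed because pick_driver resolved to vfio-pci: the box exposes 238 IOMMU groups and modprobe --show-depends found real .ko.xz objects for vfio_pci. The decision boils down to something like this sketch (structure inferred from the trace, not the verbatim setup/driver.sh; the fallback string is the one the @51 check compares against):

  shopt -s nullglob
  pick_driver() {
      local unsafe=N
      local -a groups=(/sys/kernel/iommu_groups/*)
      [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] &&
          unsafe=$(< /sys/module/vfio/parameters/enable_unsafe_noiommu_mode)
      if (( ${#groups[@]} > 0 )) || [[ $unsafe == Y ]]; then
          # Only pick vfio-pci when the module is actually loadable on
          # the running kernel: --show-depends resolves to .ko files.
          if modprobe --show-depends vfio_pci 2>/dev/null | grep -q '\.ko'; then
              echo vfio-pci
              return 0
          fi
      fi
      echo 'No valid driver found'
      return 1
  }
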
15:07:37 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 15:07:37 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 15:07:37 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:59.738 ************************************ 00:05:59.738 START TEST devices 00:05:59.738 ************************************ 00:05:59.738 15:07:37 setup.sh.devices -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:59.738 * Looking for test storage... 00:05:59.738 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 15:07:37 setup.sh.devices -- scripts/common.sh -- # [version walk condensed: the same lcov check as in the driver test above -- lcov --version yields 1.15, cmp_versions finds 1 < 2 at the first component so lt 1.15 2 returns 0, and the identical LCOV_OPTS/LCOV exports with --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh are repeated verbatim] 15:07:37 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 15:07:37 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 15:07:37 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 15:07:37 setup.sh.devices -- setup/common.sh@12 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:06:05.017 15:07:43 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:06:05.017 15:07:43 setup.sh.devices -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:06:05.017 15:07:43 setup.sh.devices -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:06:05.017 15:07:43 setup.sh.devices -- common/autotest_common.sh@1658 -- # local nvme bdf 00:06:05.017 15:07:43 setup.sh.devices -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:06:05.017 15:07:43 setup.sh.devices -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:06:05.017 15:07:43 setup.sh.devices -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:06:05.017 15:07:43 setup.sh.devices -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:06:05.017 15:07:43 setup.sh.devices -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:06:05.017 15:07:43 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:06:05.017 15:07:43 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:06:05.017 15:07:43 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:06:05.017 15:07:43 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:06:05.017 15:07:43 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:06:05.017 15:07:43 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:06:05.017 15:07:43 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:06:05.017 15:07:43 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:06:05.017 15:07:43 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:1a:00.0 00:06:05.017 15:07:43 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\1\a\:\0\0\.\0* ]] 00:06:05.017 15:07:43 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:06:05.017 15:07:43 setup.sh.devices -- scripts/common.sh@381 -- # local block=nvme0n1 pt 00:06:05.017 15:07:43 setup.sh.devices -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:06:05.017 No valid GPT data, bailing 00:06:05.017 15:07:43 setup.sh.devices -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:06:05.017 15:07:43 setup.sh.devices -- scripts/common.sh@394 -- # pt= 00:06:05.017 15:07:43 setup.sh.devices -- scripts/common.sh@395 -- # return 1 00:06:05.017 15:07:43 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:06:05.017 15:07:43 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:06:05.017 15:07:43 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:06:05.017 15:07:43 setup.sh.devices -- setup/common.sh@80 -- # echo 4000787030016 00:06:05.017 15:07:43 setup.sh.devices -- setup/devices.sh@204 -- # (( 4000787030016 >= min_disk_size )) 00:06:05.017 15:07:43 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:06:05.017 15:07:43 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:1a:00.0 00:06:05.017 15:07:43 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:06:05.017 15:07:43 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:06:05.017 15:07:43 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:06:05.017 15:07:43 setup.sh.devices -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:05.017 15:07:43 
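
A disk qualifies for the mount tests only after the three gates seen above: it is not zoned, spdk-gpt.py finds no existing partition signature ("No valid GPT data, bailing" is the pass case), and its capacity meets min_disk_size (3221225472 bytes = 3 GiB; this nvme0n1 reports 4000787030016). Roughly, as a sketch (helper name is illustrative, and blkid stands in for the spdk-gpt.py probe):

  min_disk_size=3221225472   # 3 GiB, from devices.sh@198
  usable_test_disk() {
      local dev=$1 zoned size
      # Zoned devices are skipped outright; a missing attribute counts
      # as non-zoned here.
      zoned=$(cat "/sys/block/$dev/queue/zoned" 2>/dev/null || echo none)
      [[ $zoned == none ]] || return 1
      # Any PTTYPE output means a live partition table - keep hands off.
      [[ -z $(blkid -s PTTYPE -o value "/dev/$dev" 2>/dev/null) ]] || return 1
      # The sysfs size attribute counts 512-byte sectors.
      size=$(( $(< "/sys/block/$dev/size") * 512 ))
      (( size >= min_disk_size ))
  }
  usable_test_disk nvme0n1 && echo "nvme0n1 becomes the test disk"
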
setup.sh.devices -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:05.017 15:07:43 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:05.017 ************************************ 00:06:05.017 START TEST nvme_mount 00:06:05.017 ************************************ 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1129 -- # nvme_mount 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:06:05.017 15:07:43 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:06:05.957 Creating new GPT entries in memory. 00:06:05.957 GPT data structures destroyed! You may now partition the disk using fdisk or 00:06:05.957 other utilities. 00:06:05.957 15:07:44 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:06:05.957 15:07:44 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:05.957 15:07:44 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:05.957 15:07:44 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:05.957 15:07:44 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:06:07.335 Creating new GPT entries in memory. 00:06:07.335 The operation has completed successfully. 
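For anyone replaying this step by hand, the partition_drive helper above reduces to a zap-and-repartition pass, with flock serializing the table rewrite against concurrent udev scans of the disk. A minimal sketch, assuming a disposable scratch disk: the device name and the 1 GiB sector window (2048..2099199) are taken from the log above, and udevadm settle stands in for the sync_dev_uevents.sh listener the test actually uses.

    disk=/dev/nvme0n1
    sgdisk "$disk" --zap-all                           # wipe any existing GPT/MBR structures
    flock "$disk" sgdisk "$disk" --new=1:2048:2099199  # partition 1: 2097152 sectors (1 GiB)
    udevadm settle                                     # wait for the nvme0n1p1 uevent before mkfs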
00:06:07.335 15:07:45 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:07.335 15:07:45 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:07.335 15:07:45 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 1443743 00:06:07.335 15:07:45 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:07.335 15:07:45 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:06:07.335 15:07:45 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:07.335 15:07:45 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:06:07.335 15:07:45 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:06:07.335 15:07:45 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:07.335 15:07:45 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:1a:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:07.335 15:07:45 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:06:07.335 15:07:45 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:06:07.335 15:07:45 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:07.335 15:07:45 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:07.335 15:07:45 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:07.336 15:07:45 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:07.336 15:07:45 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:06:07.336 15:07:45 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:06:07.336 15:07:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.336 15:07:45 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:06:07.336 15:07:45 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:07.336 15:07:45 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:07.336 15:07:45 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.533 
15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:11.533 15:07:49 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.909 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:12.909 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:12.909 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:13.168 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:13.168 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:13.168 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:06:13.169 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:13.169 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:13.169 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:13.169 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:13.169 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:13.169 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:13.169 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:13.428 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:13.428 /dev/nvme0n1: 8 bytes were erased at offset 0x3a3817d5e00 (gpt): 45 46 49 20 50 41 52 54 00:06:13.428 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:13.428 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:13.428 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:06:13.428 15:07:51 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:06:13.428 15:07:51 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:13.428 15:07:51 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:06:13.429 15:07:51 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:06:13.429 15:07:51 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:13.429 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:1a:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:13.429 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:06:13.429 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:06:13.429 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local 
mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:13.429 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:13.429 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:13.429 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:13.429 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:06:13.429 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:06:13.429 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.429 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:06:13.429 15:07:51 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:13.429 15:07:51 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:13.429 15:07:51 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:16.718 15:07:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:16.718 15:07:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:16.718 15:07:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:18.623 15:07:57 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:18.623 15:07:57 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:18.623 15:07:57 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:18.623 15:07:57 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:18.623 15:07:57 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:18.623 15:07:57 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:18.623 15:07:57 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:1a:00.0 data@nvme0n1 '' '' 00:06:18.623 15:07:57 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:06:18.623 15:07:57 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:06:18.623 15:07:57 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:06:18.623 15:07:57 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:06:18.623 15:07:57 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:18.623 15:07:57 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:18.623 15:07:57 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local 
pci status 00:06:18.623 15:07:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:18.623 15:07:57 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:06:18.623 15:07:57 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:18.623 15:07:57 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:18.623 15:07:57 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ 
_ status 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.818 15:08:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:24.725 15:08:02 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:24.725 15:08:02 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:24.725 15:08:02 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:06:24.725 15:08:02 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:06:24.725 15:08:02 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:24.725 15:08:02 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:24.725 15:08:02 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:24.725 15:08:02 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:24.725 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:24.725 00:06:24.725 real 0m19.332s 00:06:24.725 user 0m4.998s 00:06:24.725 sys 0m11.750s 00:06:24.725 15:08:02 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:24.725 15:08:02 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:06:24.725 ************************************ 00:06:24.725 END TEST nvme_mount 00:06:24.725 ************************************ 00:06:24.725 15:08:02 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:06:24.725 15:08:02 setup.sh.devices -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:24.725 15:08:02 setup.sh.devices -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:24.725 15:08:02 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:24.725 ************************************ 00:06:24.725 START TEST dm_mount 00:06:24.725 ************************************ 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- common/autotest_common.sh@1129 -- # dm_mount 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:06:24.725 
15:08:03 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:06:24.725 15:08:03 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:06:25.662 Creating new GPT entries in memory. 00:06:25.662 GPT data structures destroyed! You may now partition the disk using fdisk or 00:06:25.662 other utilities. 00:06:25.662 15:08:04 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:06:25.662 15:08:04 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:25.662 15:08:04 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:25.662 15:08:04 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:25.662 15:08:04 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:06:26.599 Creating new GPT entries in memory. 00:06:26.599 The operation has completed successfully. 00:06:26.599 15:08:05 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:26.599 15:08:05 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:26.599 15:08:05 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:26.599 15:08:05 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:26.599 15:08:05 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:06:27.538 The operation has completed successfully. 
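With both 1 GiB partitions in place, devices.sh@155 next assembles the nvme_dm_test device-mapper node over them; the exact dm table it passes to dmsetup is not echoed into this log. As an illustrative stand-in only, a plain linear concatenation of the two partitions would look as follows, with the 2097152-sector lengths derived from the sgdisk windows created above:

    dmsetup create nvme_dm_test <<'EOF'
    0 2097152 linear /dev/nvme0n1p1 0
    2097152 2097152 linear /dev/nvme0n1p2 0
    EOF
    readlink -f /dev/mapper/nvme_dm_test   # resolves to /dev/dm-0, matching devices.sh@165 below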
00:06:27.538 15:08:06 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:27.538 15:08:06 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:27.538 15:08:06 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 1448937 00:06:27.538 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:06:27.538 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:27.538 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:27.538 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:06:27.538 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:06:27.538 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:27.538 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:06:27.538 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:27.538 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:06:27.538 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:06:27.538 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:06:27.538 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:06:27.538 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:06:27.538 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:27.538 15:08:06 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:06:27.539 15:08:06 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:27.539 15:08:06 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:27.539 15:08:06 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:06:27.539 15:08:06 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:27.798 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:1a:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:27.798 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:06:27.798 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:06:27.798 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:27.798 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:27.798 15:08:06 setup.sh.devices.dm_mount -- 
setup/devices.sh@53 -- # local found=0 00:06:27.798 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:27.798 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:06:27.798 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:06:27.798 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.798 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:06:27.798 15:08:06 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:06:27.798 15:08:06 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:27.798 15:08:06 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- 
setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:31.993 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:31.994 15:08:10 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.911 15:08:12 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:33.911 15:08:12 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:06:33.911 15:08:12 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:33.911 15:08:12 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:33.911 15:08:12 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:33.911 15:08:12 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:33.911 15:08:12 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:1a:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:06:33.911 15:08:12 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:06:33.911 15:08:12 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:06:33.911 15:08:12 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:06:33.911 15:08:12 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:06:33.911 15:08:12 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:06:33.911 15:08:12 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:33.911 15:08:12 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:06:33.911 15:08:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.911 15:08:12 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:06:33.911 15:08:12 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:06:33.911 15:08:12 
setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:33.911 15:08:12 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- 
setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:38.108 15:08:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:40.017 15:08:18 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:40.017 15:08:18 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:40.017 15:08:18 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:06:40.017 15:08:18 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:06:40.017 15:08:18 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:40.017 15:08:18 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:40.017 15:08:18 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:06:40.017 15:08:18 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:40.017 15:08:18 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:06:40.017 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:40.017 15:08:18 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:40.017 15:08:18 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:06:40.017 00:06:40.017 real 0m15.490s 00:06:40.017 user 0m4.299s 00:06:40.017 sys 0m8.203s 00:06:40.017 15:08:18 setup.sh.devices.dm_mount -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:40.017 15:08:18 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:06:40.017 ************************************ 00:06:40.017 END TEST dm_mount 00:06:40.017 ************************************ 00:06:40.017 15:08:18 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:06:40.017 15:08:18 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:06:40.017 15:08:18 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:40.017 15:08:18 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:40.017 15:08:18 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:40.017 15:08:18 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:40.017 15:08:18 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:40.278 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:40.278 /dev/nvme0n1: 8 bytes were erased at offset 0x3a3817d5e00 (gpt): 45 46 49 20 50 41 52 54 00:06:40.278 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:40.278 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:40.278 15:08:18 
setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:06:40.278 15:08:18 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:40.278 15:08:18 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:40.278 15:08:18 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:40.278 15:08:18 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:40.278 15:08:18 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:06:40.278 15:08:18 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:06:40.278 00:06:40.278 real 0m41.684s 00:06:40.278 user 0m11.607s 00:06:40.278 sys 0m24.341s 00:06:40.278 15:08:18 setup.sh.devices -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:40.278 15:08:18 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:40.278 ************************************ 00:06:40.278 END TEST devices 00:06:40.278 ************************************ 00:06:40.278 00:06:40.278 real 2m24.718s 00:06:40.278 user 0m42.012s 00:06:40.278 sys 1m25.246s 00:06:40.278 15:08:18 setup.sh -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:40.278 15:08:18 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:40.278 ************************************ 00:06:40.278 END TEST setup.sh 00:06:40.278 ************************************ 00:06:40.278 15:08:18 -- spdk/autotest.sh@115 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:06:44.476 Hugepages 00:06:44.476 node hugesize free / total 00:06:44.476 node0 1048576kB 0 / 0 00:06:44.476 node0 2048kB 1024 / 1024 00:06:44.476 node1 1048576kB 0 / 0 00:06:44.476 node1 2048kB 1024 / 1024 00:06:44.476 00:06:44.476 Type BDF Vendor Device NUMA Driver Device Block devices 00:06:44.477 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:06:44.477 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:06:44.477 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:06:44.477 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:06:44.477 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:06:44.477 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:06:44.477 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:06:44.477 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:06:44.477 NVMe 0000:1a:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:06:44.477 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:06:44.477 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:06:44.477 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:06:44.477 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:06:44.477 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:06:44.477 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:06:44.477 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:06:44.477 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:06:44.477 15:08:22 -- spdk/autotest.sh@117 -- # uname -s 00:06:44.477 15:08:22 -- spdk/autotest.sh@117 -- # [[ Linux == Linux ]] 00:06:44.477 15:08:22 -- spdk/autotest.sh@119 -- # nvme_namespace_revert 00:06:44.477 15:08:22 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:48.674 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:48.674 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:48.674 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:48.674 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:48.674 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:48.674 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:48.674 0000:00:04.1 
(8086 2021): ioatdma -> vfio-pci 00:06:48.674 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:48.674 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:48.674 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:48.674 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:48.674 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:48.674 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:48.674 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:48.674 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:48.674 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:51.212 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:06:53.749 15:08:32 -- common/autotest_common.sh@1517 -- # sleep 1 00:06:54.689 15:08:33 -- common/autotest_common.sh@1518 -- # bdfs=() 00:06:54.689 15:08:33 -- common/autotest_common.sh@1518 -- # local bdfs 00:06:54.689 15:08:33 -- common/autotest_common.sh@1520 -- # bdfs=($(get_nvme_bdfs)) 00:06:54.689 15:08:33 -- common/autotest_common.sh@1520 -- # get_nvme_bdfs 00:06:54.689 15:08:33 -- common/autotest_common.sh@1498 -- # bdfs=() 00:06:54.689 15:08:33 -- common/autotest_common.sh@1498 -- # local bdfs 00:06:54.689 15:08:33 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:54.689 15:08:33 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:54.689 15:08:33 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:06:54.689 15:08:33 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:06:54.689 15:08:33 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:1a:00.0 00:06:54.689 15:08:33 -- common/autotest_common.sh@1522 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:06:58.886 Waiting for block devices as requested 00:06:58.886 0000:1a:00.0 (8086 0a54): vfio-pci -> nvme 00:06:58.886 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:58.886 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:58.887 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:58.887 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:58.887 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:58.887 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:58.887 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:59.147 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:59.147 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:59.147 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:59.147 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:59.408 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:59.408 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:59.408 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:59.668 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:59.668 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:07:02.205 15:08:40 -- common/autotest_common.sh@1524 -- # for bdf in "${bdfs[@]}" 00:07:02.205 15:08:40 -- common/autotest_common.sh@1525 -- # get_nvme_ctrlr_from_bdf 0000:1a:00.0 00:07:02.205 15:08:40 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:07:02.205 15:08:40 -- common/autotest_common.sh@1487 -- # grep 0000:1a:00.0/nvme/nvme 00:07:02.205 15:08:40 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:17/0000:17:00.0/0000:18:00.0/0000:19:00.0/0000:1a:00.0/nvme/nvme0 00:07:02.205 15:08:40 -- common/autotest_common.sh@1488 -- # [[ -z /sys/devices/pci0000:17/0000:17:00.0/0000:18:00.0/0000:19:00.0/0000:1a:00.0/nvme/nvme0 ]] 00:07:02.205 15:08:40 -- 
common/autotest_common.sh@1492 -- # basename /sys/devices/pci0000:17/0000:17:00.0/0000:18:00.0/0000:19:00.0/0000:1a:00.0/nvme/nvme0 00:07:02.205 15:08:40 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:07:02.205 15:08:40 -- common/autotest_common.sh@1525 -- # nvme_ctrlr=/dev/nvme0 00:07:02.205 15:08:40 -- common/autotest_common.sh@1526 -- # [[ -z /dev/nvme0 ]] 00:07:02.205 15:08:40 -- common/autotest_common.sh@1531 -- # nvme id-ctrl /dev/nvme0 00:07:02.205 15:08:40 -- common/autotest_common.sh@1531 -- # grep oacs 00:07:02.205 15:08:40 -- common/autotest_common.sh@1531 -- # cut -d: -f2 00:07:02.205 15:08:40 -- common/autotest_common.sh@1531 -- # oacs=' 0xe' 00:07:02.205 15:08:40 -- common/autotest_common.sh@1532 -- # oacs_ns_manage=8 00:07:02.205 15:08:40 -- common/autotest_common.sh@1534 -- # [[ 8 -ne 0 ]] 00:07:02.205 15:08:40 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:07:02.205 15:08:40 -- common/autotest_common.sh@1540 -- # grep unvmcap 00:07:02.205 15:08:40 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:07:02.205 15:08:40 -- common/autotest_common.sh@1540 -- # unvmcap=' 0' 00:07:02.205 15:08:40 -- common/autotest_common.sh@1541 -- # [[ 0 -eq 0 ]] 00:07:02.205 15:08:40 -- common/autotest_common.sh@1543 -- # continue 00:07:02.205 15:08:40 -- spdk/autotest.sh@122 -- # timing_exit pre_cleanup 00:07:02.205 15:08:40 -- common/autotest_common.sh@732 -- # xtrace_disable 00:07:02.205 15:08:40 -- common/autotest_common.sh@10 -- # set +x 00:07:02.205 15:08:40 -- spdk/autotest.sh@125 -- # timing_enter afterboot 00:07:02.205 15:08:40 -- common/autotest_common.sh@726 -- # xtrace_disable 00:07:02.205 15:08:40 -- common/autotest_common.sh@10 -- # set +x 00:07:02.205 15:08:40 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:07:06.403 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:07:06.403 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:07:06.403 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:07:06.403 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:07:06.403 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:07:06.403 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:07:06.403 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:07:06.403 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:07:06.403 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:07:06.403 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:07:06.403 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:07:06.403 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:07:06.403 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:07:06.403 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:07:06.403 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:07:06.403 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:07:08.939 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:07:11.472 15:08:49 -- spdk/autotest.sh@127 -- # timing_exit afterboot 00:07:11.472 15:08:49 -- common/autotest_common.sh@732 -- # xtrace_disable 00:07:11.472 15:08:49 -- common/autotest_common.sh@10 -- # set +x 00:07:11.472 15:08:49 -- spdk/autotest.sh@131 -- # opal_revert_cleanup 00:07:11.472 15:08:49 -- common/autotest_common.sh@1578 -- # mapfile -t bdfs 00:07:11.472 15:08:49 -- common/autotest_common.sh@1578 -- # get_nvme_bdfs_by_id 0x0a54 00:07:11.472 15:08:49 -- common/autotest_common.sh@1563 -- # bdfs=() 00:07:11.472 15:08:49 -- common/autotest_common.sh@1563 -- # _bdfs=() 00:07:11.472 15:08:49 -- common/autotest_common.sh@1563 -- # local bdfs _bdfs 00:07:11.472 15:08:49 -- 
common/autotest_common.sh@1564 -- # _bdfs=($(get_nvme_bdfs)) 00:07:11.472 15:08:49 -- common/autotest_common.sh@1564 -- # get_nvme_bdfs 00:07:11.472 15:08:49 -- common/autotest_common.sh@1498 -- # bdfs=() 00:07:11.472 15:08:49 -- common/autotest_common.sh@1498 -- # local bdfs 00:07:11.472 15:08:49 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:07:11.472 15:08:49 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:07:11.472 15:08:49 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:07:11.472 15:08:49 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:07:11.472 15:08:49 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:1a:00.0 00:07:11.472 15:08:49 -- common/autotest_common.sh@1565 -- # for bdf in "${_bdfs[@]}" 00:07:11.472 15:08:49 -- common/autotest_common.sh@1566 -- # cat /sys/bus/pci/devices/0000:1a:00.0/device 00:07:11.472 15:08:49 -- common/autotest_common.sh@1566 -- # device=0x0a54 00:07:11.472 15:08:49 -- common/autotest_common.sh@1567 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:07:11.472 15:08:49 -- common/autotest_common.sh@1568 -- # bdfs+=($bdf) 00:07:11.472 15:08:49 -- common/autotest_common.sh@1572 -- # (( 1 > 0 )) 00:07:11.472 15:08:49 -- common/autotest_common.sh@1573 -- # printf '%s\n' 0000:1a:00.0 00:07:11.472 15:08:49 -- common/autotest_common.sh@1579 -- # [[ -z 0000:1a:00.0 ]] 00:07:11.472 15:08:49 -- common/autotest_common.sh@1584 -- # spdk_tgt_pid=1459973 00:07:11.472 15:08:49 -- common/autotest_common.sh@1583 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:11.472 15:08:49 -- common/autotest_common.sh@1585 -- # waitforlisten 1459973 00:07:11.472 15:08:49 -- common/autotest_common.sh@835 -- # '[' -z 1459973 ']' 00:07:11.472 15:08:49 -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:11.472 15:08:49 -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:11.472 15:08:49 -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:11.472 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:11.472 15:08:49 -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:11.472 15:08:49 -- common/autotest_common.sh@10 -- # set +x 00:07:11.472 [2024-11-20 15:08:50.012693] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
00:07:11.472 [2024-11-20 15:08:50.012769] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1459973 ] 00:07:11.472 [2024-11-20 15:08:50.103562] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.472 [2024-11-20 15:08:50.127973] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.731 15:08:50 -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:11.731 15:08:50 -- common/autotest_common.sh@868 -- # return 0 00:07:11.731 15:08:50 -- common/autotest_common.sh@1587 -- # bdf_id=0 00:07:11.731 15:08:50 -- common/autotest_common.sh@1588 -- # for bdf in "${bdfs[@]}" 00:07:11.731 15:08:50 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:1a:00.0 00:07:15.014 nvme0n1 00:07:15.015 15:08:53 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:07:15.015 [2024-11-20 15:08:53.559120] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:07:15.015 request: 00:07:15.015 { 00:07:15.015 "nvme_ctrlr_name": "nvme0", 00:07:15.015 "password": "test", 00:07:15.015 "method": "bdev_nvme_opal_revert", 00:07:15.015 "req_id": 1 00:07:15.015 } 00:07:15.015 Got JSON-RPC error response 00:07:15.015 response: 00:07:15.015 { 00:07:15.015 "code": -32602, 00:07:15.015 "message": "Invalid parameters" 00:07:15.015 } 00:07:15.015 15:08:53 -- common/autotest_common.sh@1591 -- # true 00:07:15.015 15:08:53 -- common/autotest_common.sh@1592 -- # (( ++bdf_id )) 00:07:15.015 15:08:53 -- common/autotest_common.sh@1595 -- # killprocess 1459973 00:07:15.015 15:08:53 -- common/autotest_common.sh@954 -- # '[' -z 1459973 ']' 00:07:15.015 15:08:53 -- common/autotest_common.sh@958 -- # kill -0 1459973 00:07:15.015 15:08:53 -- common/autotest_common.sh@959 -- # uname 00:07:15.015 15:08:53 -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:15.015 15:08:53 -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1459973 00:07:15.015 15:08:53 -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:15.015 15:08:53 -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:15.015 15:08:53 -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1459973' 00:07:15.015 killing process with pid 1459973 00:07:15.015 15:08:53 -- common/autotest_common.sh@973 -- # kill 1459973 00:07:15.015 15:08:53 -- common/autotest_common.sh@978 -- # wait 1459973 00:07:19.198 15:08:57 -- spdk/autotest.sh@137 -- # '[' 0 -eq 1 ']' 00:07:19.198 15:08:57 -- spdk/autotest.sh@141 -- # '[' 1 -eq 1 ']' 00:07:19.198 15:08:57 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:07:19.198 15:08:57 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:07:19.198 15:08:57 -- spdk/autotest.sh@149 -- # timing_enter lib 00:07:19.198 15:08:57 -- common/autotest_common.sh@726 -- # xtrace_disable 00:07:19.198 15:08:57 -- common/autotest_common.sh@10 -- # set +x 00:07:19.198 15:08:57 -- spdk/autotest.sh@151 -- # [[ 0 -eq 1 ]] 00:07:19.198 15:08:57 -- spdk/autotest.sh@155 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:07:19.198 15:08:57 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:19.198 15:08:57 -- common/autotest_common.sh@1111 
-- # xtrace_disable 00:07:19.198 15:08:57 -- common/autotest_common.sh@10 -- # set +x 00:07:19.198 ************************************ 00:07:19.198 START TEST env 00:07:19.198 ************************************ 00:07:19.198 15:08:57 env -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:07:19.198 * Looking for test storage... 00:07:19.198 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:07:19.198 15:08:57 env -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:19.198 15:08:57 env -- common/autotest_common.sh@1693 -- # lcov --version 00:07:19.198 15:08:57 env -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:19.198 15:08:57 env -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:19.198 15:08:57 env -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:19.198 15:08:57 env -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:19.198 15:08:57 env -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:19.198 15:08:57 env -- scripts/common.sh@336 -- # IFS=.-: 00:07:19.198 15:08:57 env -- scripts/common.sh@336 -- # read -ra ver1 00:07:19.198 15:08:57 env -- scripts/common.sh@337 -- # IFS=.-: 00:07:19.198 15:08:57 env -- scripts/common.sh@337 -- # read -ra ver2 00:07:19.198 15:08:57 env -- scripts/common.sh@338 -- # local 'op=<' 00:07:19.198 15:08:57 env -- scripts/common.sh@340 -- # ver1_l=2 00:07:19.198 15:08:57 env -- scripts/common.sh@341 -- # ver2_l=1 00:07:19.198 15:08:57 env -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:19.198 15:08:57 env -- scripts/common.sh@344 -- # case "$op" in 00:07:19.198 15:08:57 env -- scripts/common.sh@345 -- # : 1 00:07:19.198 15:08:57 env -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:19.198 15:08:57 env -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:19.198 15:08:57 env -- scripts/common.sh@365 -- # decimal 1 00:07:19.199 15:08:57 env -- scripts/common.sh@353 -- # local d=1 00:07:19.199 15:08:57 env -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:19.199 15:08:57 env -- scripts/common.sh@355 -- # echo 1 00:07:19.199 15:08:57 env -- scripts/common.sh@365 -- # ver1[v]=1 00:07:19.199 15:08:57 env -- scripts/common.sh@366 -- # decimal 2 00:07:19.199 15:08:57 env -- scripts/common.sh@353 -- # local d=2 00:07:19.199 15:08:57 env -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:19.199 15:08:57 env -- scripts/common.sh@355 -- # echo 2 00:07:19.199 15:08:57 env -- scripts/common.sh@366 -- # ver2[v]=2 00:07:19.199 15:08:57 env -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:19.199 15:08:57 env -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:19.199 15:08:57 env -- scripts/common.sh@368 -- # return 0 00:07:19.199 15:08:57 env -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:19.199 15:08:57 env -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:19.199 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:19.199 --rc genhtml_branch_coverage=1 00:07:19.199 --rc genhtml_function_coverage=1 00:07:19.199 --rc genhtml_legend=1 00:07:19.199 --rc geninfo_all_blocks=1 00:07:19.199 --rc geninfo_unexecuted_blocks=1 00:07:19.199 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:19.199 ' 00:07:19.199 15:08:57 env -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:19.199 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:19.199 --rc genhtml_branch_coverage=1 00:07:19.199 --rc genhtml_function_coverage=1 00:07:19.199 --rc genhtml_legend=1 00:07:19.199 --rc geninfo_all_blocks=1 00:07:19.199 --rc geninfo_unexecuted_blocks=1 00:07:19.199 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:19.199 ' 00:07:19.199 15:08:57 env -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:19.199 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:19.199 --rc genhtml_branch_coverage=1 00:07:19.199 --rc genhtml_function_coverage=1 00:07:19.199 --rc genhtml_legend=1 00:07:19.199 --rc geninfo_all_blocks=1 00:07:19.199 --rc geninfo_unexecuted_blocks=1 00:07:19.199 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:19.199 ' 00:07:19.199 15:08:57 env -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:19.199 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:19.199 --rc genhtml_branch_coverage=1 00:07:19.199 --rc genhtml_function_coverage=1 00:07:19.199 --rc genhtml_legend=1 00:07:19.199 --rc geninfo_all_blocks=1 00:07:19.199 --rc geninfo_unexecuted_blocks=1 00:07:19.199 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:19.199 ' 00:07:19.199 15:08:57 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:07:19.199 15:08:57 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:19.199 15:08:57 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:19.199 15:08:57 env -- common/autotest_common.sh@10 -- # set +x 00:07:19.199 ************************************ 00:07:19.199 START TEST env_memory 00:07:19.199 ************************************ 00:07:19.199 15:08:57 env.env_memory -- 
common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:07:19.199 00:07:19.199 00:07:19.199 CUnit - A unit testing framework for C - Version 2.1-3 00:07:19.199 http://cunit.sourceforge.net/ 00:07:19.199 00:07:19.199 00:07:19.199 Suite: memory 00:07:19.199 Test: alloc and free memory map ...[2024-11-20 15:08:57.880403] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:07:19.459 passed 00:07:19.459 Test: mem map translation ...[2024-11-20 15:08:57.893419] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:07:19.459 [2024-11-20 15:08:57.893440] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:07:19.459 [2024-11-20 15:08:57.893470] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 589:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:07:19.459 [2024-11-20 15:08:57.893480] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 605:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:07:19.459 passed 00:07:19.459 Test: mem map registration ...[2024-11-20 15:08:57.913790] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=200000 len=1234 00:07:19.459 [2024-11-20 15:08:57.913806] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=4d2 len=2097152 00:07:19.459 passed 00:07:19.459 Test: mem map adjacent registrations ...passed 00:07:19.459 00:07:19.459 Run Summary: Type Total Ran Passed Failed Inactive 00:07:19.459 suites 1 1 n/a 0 0 00:07:19.459 tests 4 4 4 0 0 00:07:19.459 asserts 152 152 152 0 n/a 00:07:19.459 00:07:19.459 Elapsed time = 0.076 seconds 00:07:19.459 00:07:19.459 real 0m0.084s 00:07:19.459 user 0m0.075s 00:07:19.459 sys 0m0.009s 00:07:19.459 15:08:57 env.env_memory -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:19.459 15:08:57 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:07:19.459 ************************************ 00:07:19.459 END TEST env_memory 00:07:19.459 ************************************ 00:07:19.459 15:08:57 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:07:19.459 15:08:57 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:19.459 15:08:57 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:19.459 15:08:57 env -- common/autotest_common.sh@10 -- # set +x 00:07:19.459 ************************************ 00:07:19.459 START TEST env_vtophys 00:07:19.459 ************************************ 00:07:19.459 15:08:58 env.env_vtophys -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:07:19.459 EAL: lib.eal log level changed from notice to debug 00:07:19.459 EAL: Detected lcore 0 as core 0 on socket 0 00:07:19.459 EAL: Detected lcore 1 as core 1 on socket 0 00:07:19.459 EAL: Detected lcore 2 as core 2 on socket 0 00:07:19.459 EAL: Detected lcore 3 as 
core 3 on socket 0 00:07:19.459 EAL: Detected lcore 4 as core 4 on socket 0 00:07:19.459 EAL: Detected lcore 5 as core 8 on socket 0 00:07:19.459 EAL: Detected lcore 6 as core 9 on socket 0 00:07:19.459 EAL: Detected lcore 7 as core 10 on socket 0 00:07:19.459 EAL: Detected lcore 8 as core 11 on socket 0 00:07:19.459 EAL: Detected lcore 9 as core 16 on socket 0 00:07:19.459 EAL: Detected lcore 10 as core 17 on socket 0 00:07:19.459 EAL: Detected lcore 11 as core 18 on socket 0 00:07:19.459 EAL: Detected lcore 12 as core 19 on socket 0 00:07:19.459 EAL: Detected lcore 13 as core 20 on socket 0 00:07:19.459 EAL: Detected lcore 14 as core 24 on socket 0 00:07:19.459 EAL: Detected lcore 15 as core 25 on socket 0 00:07:19.459 EAL: Detected lcore 16 as core 26 on socket 0 00:07:19.459 EAL: Detected lcore 17 as core 27 on socket 0 00:07:19.459 EAL: Detected lcore 18 as core 0 on socket 1 00:07:19.459 EAL: Detected lcore 19 as core 1 on socket 1 00:07:19.459 EAL: Detected lcore 20 as core 2 on socket 1 00:07:19.459 EAL: Detected lcore 21 as core 3 on socket 1 00:07:19.459 EAL: Detected lcore 22 as core 4 on socket 1 00:07:19.459 EAL: Detected lcore 23 as core 8 on socket 1 00:07:19.459 EAL: Detected lcore 24 as core 9 on socket 1 00:07:19.459 EAL: Detected lcore 25 as core 10 on socket 1 00:07:19.459 EAL: Detected lcore 26 as core 11 on socket 1 00:07:19.459 EAL: Detected lcore 27 as core 16 on socket 1 00:07:19.459 EAL: Detected lcore 28 as core 17 on socket 1 00:07:19.459 EAL: Detected lcore 29 as core 18 on socket 1 00:07:19.459 EAL: Detected lcore 30 as core 19 on socket 1 00:07:19.459 EAL: Detected lcore 31 as core 20 on socket 1 00:07:19.459 EAL: Detected lcore 32 as core 24 on socket 1 00:07:19.459 EAL: Detected lcore 33 as core 25 on socket 1 00:07:19.459 EAL: Detected lcore 34 as core 26 on socket 1 00:07:19.459 EAL: Detected lcore 35 as core 27 on socket 1 00:07:19.459 EAL: Detected lcore 36 as core 0 on socket 0 00:07:19.459 EAL: Detected lcore 37 as core 1 on socket 0 00:07:19.459 EAL: Detected lcore 38 as core 2 on socket 0 00:07:19.459 EAL: Detected lcore 39 as core 3 on socket 0 00:07:19.459 EAL: Detected lcore 40 as core 4 on socket 0 00:07:19.459 EAL: Detected lcore 41 as core 8 on socket 0 00:07:19.459 EAL: Detected lcore 42 as core 9 on socket 0 00:07:19.459 EAL: Detected lcore 43 as core 10 on socket 0 00:07:19.459 EAL: Detected lcore 44 as core 11 on socket 0 00:07:19.459 EAL: Detected lcore 45 as core 16 on socket 0 00:07:19.459 EAL: Detected lcore 46 as core 17 on socket 0 00:07:19.459 EAL: Detected lcore 47 as core 18 on socket 0 00:07:19.459 EAL: Detected lcore 48 as core 19 on socket 0 00:07:19.459 EAL: Detected lcore 49 as core 20 on socket 0 00:07:19.459 EAL: Detected lcore 50 as core 24 on socket 0 00:07:19.459 EAL: Detected lcore 51 as core 25 on socket 0 00:07:19.459 EAL: Detected lcore 52 as core 26 on socket 0 00:07:19.459 EAL: Detected lcore 53 as core 27 on socket 0 00:07:19.459 EAL: Detected lcore 54 as core 0 on socket 1 00:07:19.459 EAL: Detected lcore 55 as core 1 on socket 1 00:07:19.459 EAL: Detected lcore 56 as core 2 on socket 1 00:07:19.459 EAL: Detected lcore 57 as core 3 on socket 1 00:07:19.459 EAL: Detected lcore 58 as core 4 on socket 1 00:07:19.459 EAL: Detected lcore 59 as core 8 on socket 1 00:07:19.459 EAL: Detected lcore 60 as core 9 on socket 1 00:07:19.459 EAL: Detected lcore 61 as core 10 on socket 1 00:07:19.459 EAL: Detected lcore 62 as core 11 on socket 1 00:07:19.459 EAL: Detected lcore 63 as core 16 on socket 1 00:07:19.459 EAL: 
Detected lcore 64 as core 17 on socket 1 00:07:19.459 EAL: Detected lcore 65 as core 18 on socket 1 00:07:19.459 EAL: Detected lcore 66 as core 19 on socket 1 00:07:19.459 EAL: Detected lcore 67 as core 20 on socket 1 00:07:19.459 EAL: Detected lcore 68 as core 24 on socket 1 00:07:19.459 EAL: Detected lcore 69 as core 25 on socket 1 00:07:19.459 EAL: Detected lcore 70 as core 26 on socket 1 00:07:19.459 EAL: Detected lcore 71 as core 27 on socket 1 00:07:19.459 EAL: Maximum logical cores by configuration: 128 00:07:19.459 EAL: Detected CPU lcores: 72 00:07:19.459 EAL: Detected NUMA nodes: 2 00:07:19.459 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:07:19.459 EAL: Checking presence of .so 'librte_eal.so.24' 00:07:19.459 EAL: Checking presence of .so 'librte_eal.so' 00:07:19.459 EAL: Detected static linkage of DPDK 00:07:19.459 EAL: No shared files mode enabled, IPC will be disabled 00:07:19.459 EAL: Bus pci wants IOVA as 'DC' 00:07:19.459 EAL: Buses did not request a specific IOVA mode. 00:07:19.459 EAL: IOMMU is available, selecting IOVA as VA mode. 00:07:19.459 EAL: Selected IOVA mode 'VA' 00:07:19.459 EAL: Probing VFIO support... 00:07:19.459 EAL: IOMMU type 1 (Type 1) is supported 00:07:19.459 EAL: IOMMU type 7 (sPAPR) is not supported 00:07:19.459 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:07:19.459 EAL: VFIO support initialized 00:07:19.459 EAL: Ask a virtual area of 0x2e000 bytes 00:07:19.459 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:07:19.459 EAL: Setting up physically contiguous memory... 00:07:19.459 EAL: Setting maximum number of open files to 524288 00:07:19.459 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:07:19.459 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:07:19.459 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:07:19.459 EAL: Ask a virtual area of 0x61000 bytes 00:07:19.459 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:07:19.459 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:19.459 EAL: Ask a virtual area of 0x400000000 bytes 00:07:19.459 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:07:19.459 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:07:19.459 EAL: Ask a virtual area of 0x61000 bytes 00:07:19.459 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:07:19.459 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:19.459 EAL: Ask a virtual area of 0x400000000 bytes 00:07:19.459 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:07:19.459 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:07:19.459 EAL: Ask a virtual area of 0x61000 bytes 00:07:19.459 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:07:19.459 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:19.459 EAL: Ask a virtual area of 0x400000000 bytes 00:07:19.459 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:07:19.459 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:07:19.459 EAL: Ask a virtual area of 0x61000 bytes 00:07:19.459 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:07:19.459 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:19.460 EAL: Ask a virtual area of 0x400000000 bytes 00:07:19.460 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:07:19.460 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:07:19.460 EAL: Creating 4 segment lists: n_segs:8192 
socket_id:1 hugepage_sz:2097152 00:07:19.460 EAL: Ask a virtual area of 0x61000 bytes 00:07:19.460 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:07:19.460 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:19.460 EAL: Ask a virtual area of 0x400000000 bytes 00:07:19.460 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:07:19.460 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:07:19.460 EAL: Ask a virtual area of 0x61000 bytes 00:07:19.460 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:07:19.460 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:19.460 EAL: Ask a virtual area of 0x400000000 bytes 00:07:19.460 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:07:19.460 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:07:19.460 EAL: Ask a virtual area of 0x61000 bytes 00:07:19.460 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:07:19.460 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:19.460 EAL: Ask a virtual area of 0x400000000 bytes 00:07:19.460 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:07:19.460 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:07:19.460 EAL: Ask a virtual area of 0x61000 bytes 00:07:19.460 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:07:19.460 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:19.460 EAL: Ask a virtual area of 0x400000000 bytes 00:07:19.460 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:07:19.460 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:07:19.460 EAL: Hugepages will be freed exactly as allocated. 00:07:19.460 EAL: No shared files mode enabled, IPC is disabled 00:07:19.460 EAL: No shared files mode enabled, IPC is disabled 00:07:19.460 EAL: TSC frequency is ~2300000 KHz 00:07:19.460 EAL: Main lcore 0 is ready (tid=7f09e4df2a00;cpuset=[0]) 00:07:19.460 EAL: Trying to obtain current memory policy. 00:07:19.460 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:19.460 EAL: Restoring previous memory policy: 0 00:07:19.460 EAL: request: mp_malloc_sync 00:07:19.460 EAL: No shared files mode enabled, IPC is disabled 00:07:19.460 EAL: Heap on socket 0 was expanded by 2MB 00:07:19.460 EAL: No shared files mode enabled, IPC is disabled 00:07:19.460 EAL: Mem event callback 'spdk:(nil)' registered 00:07:19.460 00:07:19.460 00:07:19.460 CUnit - A unit testing framework for C - Version 2.1-3 00:07:19.460 http://cunit.sourceforge.net/ 00:07:19.460 00:07:19.460 00:07:19.460 Suite: components_suite 00:07:19.460 Test: vtophys_malloc_test ...passed 00:07:19.460 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:07:19.460 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:19.460 EAL: Restoring previous memory policy: 4 00:07:19.460 EAL: Calling mem event callback 'spdk:(nil)' 00:07:19.460 EAL: request: mp_malloc_sync 00:07:19.460 EAL: No shared files mode enabled, IPC is disabled 00:07:19.460 EAL: Heap on socket 0 was expanded by 4MB 00:07:19.460 EAL: Calling mem event callback 'spdk:(nil)' 00:07:19.460 EAL: request: mp_malloc_sync 00:07:19.460 EAL: No shared files mode enabled, IPC is disabled 00:07:19.460 EAL: Heap on socket 0 was shrunk by 4MB 00:07:19.460 EAL: Trying to obtain current memory policy. 
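A quick cross-check of the memseg-list reservations above: each list holds 8192 segments of 2 MiB hugepages, and 8192 x 2 MiB = 16 GiB = 0x400000000 bytes, exactly the size of every VA area the EAL reports reserving; with 4 lists per socket across the 2 NUMA nodes, that is 8 x 16 GiB of virtual address space set aside before any hugepage is actually backed. The arithmetic can be confirmed with plain shell (illustration only, not part of the test):

  printf '0x%x\n' $(( 8192 * 2 * 1024 * 1024 ))   # -> 0x400000000, matching each reserved area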
00:07:19.460 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:19.460 EAL: Restoring previous memory policy: 4 00:07:19.460 EAL: Calling mem event callback 'spdk:(nil)' 00:07:19.460 EAL: request: mp_malloc_sync 00:07:19.460 EAL: No shared files mode enabled, IPC is disabled 00:07:19.460 EAL: Heap on socket 0 was expanded by 6MB 00:07:19.460 EAL: Calling mem event callback 'spdk:(nil)' 00:07:19.460 EAL: request: mp_malloc_sync 00:07:19.460 EAL: No shared files mode enabled, IPC is disabled 00:07:19.460 EAL: Heap on socket 0 was shrunk by 6MB 00:07:19.460 EAL: Trying to obtain current memory policy. 00:07:19.460 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:19.460 EAL: Restoring previous memory policy: 4 00:07:19.460 EAL: Calling mem event callback 'spdk:(nil)' 00:07:19.460 EAL: request: mp_malloc_sync 00:07:19.460 EAL: No shared files mode enabled, IPC is disabled 00:07:19.460 EAL: Heap on socket 0 was expanded by 10MB 00:07:19.460 EAL: Calling mem event callback 'spdk:(nil)' 00:07:19.460 EAL: request: mp_malloc_sync 00:07:19.460 EAL: No shared files mode enabled, IPC is disabled 00:07:19.460 EAL: Heap on socket 0 was shrunk by 10MB 00:07:19.460 EAL: Trying to obtain current memory policy. 00:07:19.460 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:19.460 EAL: Restoring previous memory policy: 4 00:07:19.460 EAL: Calling mem event callback 'spdk:(nil)' 00:07:19.460 EAL: request: mp_malloc_sync 00:07:19.460 EAL: No shared files mode enabled, IPC is disabled 00:07:19.460 EAL: Heap on socket 0 was expanded by 18MB 00:07:19.460 EAL: Calling mem event callback 'spdk:(nil)' 00:07:19.460 EAL: request: mp_malloc_sync 00:07:19.460 EAL: No shared files mode enabled, IPC is disabled 00:07:19.460 EAL: Heap on socket 0 was shrunk by 18MB 00:07:19.460 EAL: Trying to obtain current memory policy. 00:07:19.460 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:19.460 EAL: Restoring previous memory policy: 4 00:07:19.460 EAL: Calling mem event callback 'spdk:(nil)' 00:07:19.460 EAL: request: mp_malloc_sync 00:07:19.460 EAL: No shared files mode enabled, IPC is disabled 00:07:19.460 EAL: Heap on socket 0 was expanded by 34MB 00:07:19.460 EAL: Calling mem event callback 'spdk:(nil)' 00:07:19.460 EAL: request: mp_malloc_sync 00:07:19.460 EAL: No shared files mode enabled, IPC is disabled 00:07:19.460 EAL: Heap on socket 0 was shrunk by 34MB 00:07:19.460 EAL: Trying to obtain current memory policy. 00:07:19.460 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:19.460 EAL: Restoring previous memory policy: 4 00:07:19.460 EAL: Calling mem event callback 'spdk:(nil)' 00:07:19.460 EAL: request: mp_malloc_sync 00:07:19.460 EAL: No shared files mode enabled, IPC is disabled 00:07:19.460 EAL: Heap on socket 0 was expanded by 66MB 00:07:19.719 EAL: Calling mem event callback 'spdk:(nil)' 00:07:19.719 EAL: request: mp_malloc_sync 00:07:19.719 EAL: No shared files mode enabled, IPC is disabled 00:07:19.719 EAL: Heap on socket 0 was shrunk by 66MB 00:07:19.719 EAL: Trying to obtain current memory policy. 
00:07:19.719 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:19.719 EAL: Restoring previous memory policy: 4 00:07:19.719 EAL: Calling mem event callback 'spdk:(nil)' 00:07:19.719 EAL: request: mp_malloc_sync 00:07:19.719 EAL: No shared files mode enabled, IPC is disabled 00:07:19.719 EAL: Heap on socket 0 was expanded by 130MB 00:07:19.719 EAL: Calling mem event callback 'spdk:(nil)' 00:07:19.719 EAL: request: mp_malloc_sync 00:07:19.719 EAL: No shared files mode enabled, IPC is disabled 00:07:19.719 EAL: Heap on socket 0 was shrunk by 130MB 00:07:19.719 EAL: Trying to obtain current memory policy. 00:07:19.719 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:19.719 EAL: Restoring previous memory policy: 4 00:07:19.719 EAL: Calling mem event callback 'spdk:(nil)' 00:07:19.719 EAL: request: mp_malloc_sync 00:07:19.719 EAL: No shared files mode enabled, IPC is disabled 00:07:19.719 EAL: Heap on socket 0 was expanded by 258MB 00:07:19.719 EAL: Calling mem event callback 'spdk:(nil)' 00:07:19.719 EAL: request: mp_malloc_sync 00:07:19.719 EAL: No shared files mode enabled, IPC is disabled 00:07:19.719 EAL: Heap on socket 0 was shrunk by 258MB 00:07:19.719 EAL: Trying to obtain current memory policy. 00:07:19.719 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:19.976 EAL: Restoring previous memory policy: 4 00:07:19.976 EAL: Calling mem event callback 'spdk:(nil)' 00:07:19.976 EAL: request: mp_malloc_sync 00:07:19.976 EAL: No shared files mode enabled, IPC is disabled 00:07:19.976 EAL: Heap on socket 0 was expanded by 514MB 00:07:19.976 EAL: Calling mem event callback 'spdk:(nil)' 00:07:20.235 EAL: request: mp_malloc_sync 00:07:20.235 EAL: No shared files mode enabled, IPC is disabled 00:07:20.235 EAL: Heap on socket 0 was shrunk by 514MB 00:07:20.235 EAL: Trying to obtain current memory policy. 
00:07:20.235 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:20.235 EAL: Restoring previous memory policy: 4 00:07:20.235 EAL: Calling mem event callback 'spdk:(nil)' 00:07:20.235 EAL: request: mp_malloc_sync 00:07:20.235 EAL: No shared files mode enabled, IPC is disabled 00:07:20.235 EAL: Heap on socket 0 was expanded by 1026MB 00:07:20.492 EAL: Calling mem event callback 'spdk:(nil)' 00:07:20.749 EAL: request: mp_malloc_sync 00:07:20.749 EAL: No shared files mode enabled, IPC is disabled 00:07:20.749 EAL: Heap on socket 0 was shrunk by 1026MB 00:07:20.749 passed 00:07:20.749 00:07:20.749 Run Summary: Type Total Ran Passed Failed Inactive 00:07:20.749 suites 1 1 n/a 0 0 00:07:20.749 tests 2 2 2 0 0 00:07:20.749 asserts 497 497 497 0 n/a 00:07:20.749 00:07:20.749 Elapsed time = 1.098 seconds 00:07:20.749 EAL: Calling mem event callback 'spdk:(nil)' 00:07:20.749 EAL: request: mp_malloc_sync 00:07:20.749 EAL: No shared files mode enabled, IPC is disabled 00:07:20.749 EAL: Heap on socket 0 was shrunk by 2MB 00:07:20.749 EAL: No shared files mode enabled, IPC is disabled 00:07:20.749 EAL: No shared files mode enabled, IPC is disabled 00:07:20.749 EAL: No shared files mode enabled, IPC is disabled 00:07:20.749 00:07:20.749 real 0m1.222s 00:07:20.749 user 0m0.692s 00:07:20.749 sys 0m0.500s 00:07:20.749 15:08:59 env.env_vtophys -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:20.749 15:08:59 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:07:20.749 ************************************ 00:07:20.749 END TEST env_vtophys 00:07:20.749 ************************************ 00:07:20.749 15:08:59 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:07:20.749 15:08:59 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:20.749 15:08:59 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:20.749 15:08:59 env -- common/autotest_common.sh@10 -- # set +x 00:07:20.749 ************************************ 00:07:20.749 START TEST env_pci 00:07:20.749 ************************************ 00:07:20.749 15:08:59 env.env_pci -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:07:20.749 00:07:20.749 00:07:20.749 CUnit - A unit testing framework for C - Version 2.1-3 00:07:20.749 http://cunit.sourceforge.net/ 00:07:20.749 00:07:20.749 00:07:20.749 Suite: pci 00:07:20.749 Test: pci_hook ...[2024-11-20 15:08:59.326582] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1118:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1461295 has claimed it 00:07:20.749 EAL: Cannot find device (10000:00:01.0) 00:07:20.749 EAL: Failed to attach device on primary process 00:07:20.749 passed 00:07:20.749 00:07:20.749 Run Summary: Type Total Ran Passed Failed Inactive 00:07:20.749 suites 1 1 n/a 0 0 00:07:20.749 tests 1 1 1 0 0 00:07:20.749 asserts 25 25 25 0 n/a 00:07:20.749 00:07:20.749 Elapsed time = 0.039 seconds 00:07:20.749 00:07:20.749 real 0m0.058s 00:07:20.749 user 0m0.010s 00:07:20.749 sys 0m0.048s 00:07:20.749 15:08:59 env.env_pci -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:20.749 15:08:59 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:07:20.749 ************************************ 00:07:20.749 END TEST env_pci 00:07:20.749 ************************************ 00:07:20.749 15:08:59 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:07:20.749 
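The expansion sizes reported by vtophys_spdk_malloc_test above (4, 6, 10, 18, 34, 66, 130, 258, 514, 1026 MB) fit 2^k + 2 MiB for k = 1..10, consistent with the test doubling its buffer each round while the initial 2 MiB heap expansion stays resident; the sequence can be reproduced with shell arithmetic (a sketch of the observed pattern, not code from the suite):

  for k in $(seq 1 10); do printf '%dMB ' $(( (1 << k) + 2 )); done; echo   # 4MB 6MB ... 1026MB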
15:08:59 env -- env/env.sh@15 -- # uname 00:07:20.749 15:08:59 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:07:20.749 15:08:59 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:07:20.749 15:08:59 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:07:20.749 15:08:59 env -- common/autotest_common.sh@1105 -- # '[' 5 -le 1 ']' 00:07:20.749 15:08:59 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:20.749 15:08:59 env -- common/autotest_common.sh@10 -- # set +x 00:07:21.007 ************************************ 00:07:21.007 START TEST env_dpdk_post_init 00:07:21.007 ************************************ 00:07:21.007 15:08:59 env.env_dpdk_post_init -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:07:21.007 EAL: Detected CPU lcores: 72 00:07:21.007 EAL: Detected NUMA nodes: 2 00:07:21.007 EAL: Detected static linkage of DPDK 00:07:21.007 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:07:21.007 EAL: Selected IOVA mode 'VA' 00:07:21.007 EAL: VFIO support initialized 00:07:21.007 TELEMETRY: No legacy callbacks, legacy socket not created 00:07:21.007 EAL: Using IOMMU type 1 (Type 1) 00:07:21.940 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:1a:00.0 (socket 0) 00:07:27.440 EAL: Releasing PCI mapped resource for 0000:1a:00.0 00:07:27.440 EAL: Calling pci_unmap_resource for 0000:1a:00.0 at 0x202001000000 00:07:27.440 Starting DPDK initialization... 00:07:27.440 Starting SPDK post initialization... 00:07:27.440 SPDK NVMe probe 00:07:27.440 Attaching to 0000:1a:00.0 00:07:27.440 Attached to 0000:1a:00.0 00:07:27.440 Cleaning up... 
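The driver flips logged throughout this run (ioatdma -> vfio-pci, nvme -> vfio-pci, and back again on reset) follow the standard Linux sysfs rebinding sequence; a minimal sketch of that generic pattern is below, leaving out the hugepage and permission handling that setup.sh layers on top (assumes root and the vfio-pci module loaded):

  bdf=0000:1a:00.0                                            # example device from this run
  echo "$bdf"   > /sys/bus/pci/devices/$bdf/driver/unbind     # detach the current driver (e.g. nvme)
  echo vfio-pci > /sys/bus/pci/devices/$bdf/driver_override   # pin the next probe to vfio-pci
  echo "$bdf"   > /sys/bus/pci/drivers_probe                  # re-probe; the device lands on vfio-pci
  echo          > /sys/bus/pci/devices/$bdf/driver_override   # clear the override afterwards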
00:07:27.440 00:07:27.440 real 0m6.526s 00:07:27.440 user 0m4.989s 00:07:27.440 sys 0m0.785s 00:07:27.440 15:09:05 env.env_dpdk_post_init -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:27.440 15:09:05 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:07:27.440 ************************************ 00:07:27.440 END TEST env_dpdk_post_init 00:07:27.440 ************************************ 00:07:27.440 15:09:06 env -- env/env.sh@26 -- # uname 00:07:27.440 15:09:06 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:07:27.440 15:09:06 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:07:27.440 15:09:06 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:27.440 15:09:06 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:27.440 15:09:06 env -- common/autotest_common.sh@10 -- # set +x 00:07:27.440 ************************************ 00:07:27.440 START TEST env_mem_callbacks 00:07:27.440 ************************************ 00:07:27.440 15:09:06 env.env_mem_callbacks -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:07:27.440 EAL: Detected CPU lcores: 72 00:07:27.440 EAL: Detected NUMA nodes: 2 00:07:27.440 EAL: Detected static linkage of DPDK 00:07:27.440 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:07:27.440 EAL: Selected IOVA mode 'VA' 00:07:27.440 EAL: VFIO support initialized 00:07:27.725 TELEMETRY: No legacy callbacks, legacy socket not created 00:07:27.725 00:07:27.725 00:07:27.725 CUnit - A unit testing framework for C - Version 2.1-3 00:07:27.725 http://cunit.sourceforge.net/ 00:07:27.725 00:07:27.725 00:07:27.725 Suite: memory 00:07:27.725 Test: test ... 
00:07:27.725 register 0x200000200000 2097152 00:07:27.725 malloc 3145728 00:07:27.725 register 0x200000400000 4194304 00:07:27.725 buf 0x200000500000 len 3145728 PASSED 00:07:27.725 malloc 64 00:07:27.725 buf 0x2000004fff40 len 64 PASSED 00:07:27.725 malloc 4194304 00:07:27.725 register 0x200000800000 6291456 00:07:27.725 buf 0x200000a00000 len 4194304 PASSED 00:07:27.725 free 0x200000500000 3145728 00:07:27.725 free 0x2000004fff40 64 00:07:27.725 unregister 0x200000400000 4194304 PASSED 00:07:27.725 free 0x200000a00000 4194304 00:07:27.725 unregister 0x200000800000 6291456 PASSED 00:07:27.725 malloc 8388608 00:07:27.725 register 0x200000400000 10485760 00:07:27.725 buf 0x200000600000 len 8388608 PASSED 00:07:27.725 free 0x200000600000 8388608 00:07:27.725 unregister 0x200000400000 10485760 PASSED 00:07:27.725 passed 00:07:27.725 00:07:27.725 Run Summary: Type Total Ran Passed Failed Inactive 00:07:27.725 suites 1 1 n/a 0 0 00:07:27.725 tests 1 1 1 0 0 00:07:27.725 asserts 15 15 15 0 n/a 00:07:27.725 00:07:27.725 Elapsed time = 0.005 seconds 00:07:27.725 00:07:27.725 real 0m0.073s 00:07:27.725 user 0m0.017s 00:07:27.725 sys 0m0.056s 00:07:27.725 15:09:06 env.env_mem_callbacks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:27.725 15:09:06 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:07:27.725 ************************************ 00:07:27.725 END TEST env_mem_callbacks 00:07:27.725 ************************************ 00:07:27.725 00:07:27.725 real 0m8.542s 00:07:27.725 user 0m6.019s 00:07:27.725 sys 0m1.786s 00:07:27.725 15:09:06 env -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:27.725 15:09:06 env -- common/autotest_common.sh@10 -- # set +x 00:07:27.725 ************************************ 00:07:27.725 END TEST env 00:07:27.725 ************************************ 00:07:27.725 15:09:06 -- spdk/autotest.sh@156 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:07:27.725 15:09:06 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:27.725 15:09:06 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:27.725 15:09:06 -- common/autotest_common.sh@10 -- # set +x 00:07:27.725 ************************************ 00:07:27.725 START TEST rpc 00:07:27.725 ************************************ 00:07:27.725 15:09:06 rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:07:27.725 * Looking for test storage... 
00:07:27.725 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:07:27.725 15:09:06 rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:27.725 15:09:06 rpc -- common/autotest_common.sh@1693 -- # lcov --version 00:07:27.725 15:09:06 rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:27.725 15:09:06 rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:27.725 15:09:06 rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:27.725 15:09:06 rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:27.725 15:09:06 rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:27.725 15:09:06 rpc -- scripts/common.sh@336 -- # IFS=.-: 00:07:27.725 15:09:06 rpc -- scripts/common.sh@336 -- # read -ra ver1 00:07:27.725 15:09:06 rpc -- scripts/common.sh@337 -- # IFS=.-: 00:07:27.725 15:09:06 rpc -- scripts/common.sh@337 -- # read -ra ver2 00:07:27.725 15:09:06 rpc -- scripts/common.sh@338 -- # local 'op=<' 00:07:27.725 15:09:06 rpc -- scripts/common.sh@340 -- # ver1_l=2 00:07:27.725 15:09:06 rpc -- scripts/common.sh@341 -- # ver2_l=1 00:07:27.725 15:09:06 rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:27.725 15:09:06 rpc -- scripts/common.sh@344 -- # case "$op" in 00:07:27.725 15:09:06 rpc -- scripts/common.sh@345 -- # : 1 00:07:27.725 15:09:06 rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:27.725 15:09:06 rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:27.725 15:09:06 rpc -- scripts/common.sh@365 -- # decimal 1 00:07:28.009 15:09:06 rpc -- scripts/common.sh@353 -- # local d=1 00:07:28.009 15:09:06 rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:28.009 15:09:06 rpc -- scripts/common.sh@355 -- # echo 1 00:07:28.009 15:09:06 rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:07:28.009 15:09:06 rpc -- scripts/common.sh@366 -- # decimal 2 00:07:28.009 15:09:06 rpc -- scripts/common.sh@353 -- # local d=2 00:07:28.009 15:09:06 rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:28.009 15:09:06 rpc -- scripts/common.sh@355 -- # echo 2 00:07:28.009 15:09:06 rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:07:28.009 15:09:06 rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:28.009 15:09:06 rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:28.009 15:09:06 rpc -- scripts/common.sh@368 -- # return 0 00:07:28.009 15:09:06 rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:28.009 15:09:06 rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:28.009 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:28.009 --rc genhtml_branch_coverage=1 00:07:28.009 --rc genhtml_function_coverage=1 00:07:28.009 --rc genhtml_legend=1 00:07:28.009 --rc geninfo_all_blocks=1 00:07:28.009 --rc geninfo_unexecuted_blocks=1 00:07:28.009 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:28.009 ' 00:07:28.009 15:09:06 rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:28.009 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:28.009 --rc genhtml_branch_coverage=1 00:07:28.009 --rc genhtml_function_coverage=1 00:07:28.009 --rc genhtml_legend=1 00:07:28.009 --rc geninfo_all_blocks=1 00:07:28.009 --rc geninfo_unexecuted_blocks=1 00:07:28.009 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:28.009 ' 00:07:28.009 15:09:06 rpc -- common/autotest_common.sh@1707 -- # 
export 'LCOV=lcov 00:07:28.009 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:28.009 --rc genhtml_branch_coverage=1 00:07:28.009 --rc genhtml_function_coverage=1 00:07:28.009 --rc genhtml_legend=1 00:07:28.009 --rc geninfo_all_blocks=1 00:07:28.009 --rc geninfo_unexecuted_blocks=1 00:07:28.009 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:28.009 ' 00:07:28.009 15:09:06 rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:28.009 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:28.009 --rc genhtml_branch_coverage=1 00:07:28.009 --rc genhtml_function_coverage=1 00:07:28.009 --rc genhtml_legend=1 00:07:28.009 --rc geninfo_all_blocks=1 00:07:28.009 --rc geninfo_unexecuted_blocks=1 00:07:28.009 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:28.009 ' 00:07:28.009 15:09:06 rpc -- rpc/rpc.sh@65 -- # spdk_pid=1462471 00:07:28.009 15:09:06 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:28.009 15:09:06 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:07:28.009 15:09:06 rpc -- rpc/rpc.sh@67 -- # waitforlisten 1462471 00:07:28.009 15:09:06 rpc -- common/autotest_common.sh@835 -- # '[' -z 1462471 ']' 00:07:28.009 15:09:06 rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:28.009 15:09:06 rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:28.009 15:09:06 rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:28.009 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:28.009 15:09:06 rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:28.009 15:09:06 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:28.009 [2024-11-20 15:09:06.446577] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:07:28.009 [2024-11-20 15:09:06.446656] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1462471 ] 00:07:28.009 [2024-11-20 15:09:06.535853] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.009 [2024-11-20 15:09:06.559643] app.c: 612:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:07:28.009 [2024-11-20 15:09:06.559689] app.c: 616:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1462471' to capture a snapshot of events at runtime. 00:07:28.009 [2024-11-20 15:09:06.559699] app.c: 618:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:07:28.009 [2024-11-20 15:09:06.559708] app.c: 619:app_setup_trace: *NOTICE*: SPDK application currently running. 00:07:28.009 [2024-11-20 15:09:06.559716] app.c: 620:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1462471 for offline analysis/debug. 
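With spdk_tgt (pid 1462471) listening on /var/tmp/spdk.sock, the rpc_integrity test that follows drives the bdev RPCs through scripts/rpc.py; the same round trip it performs can be issued by hand against a running target (paths as in this workspace):

  rpc=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
  $rpc bdev_malloc_create 8 512                        # 8 MiB malloc bdev, 512 B blocks -> Malloc0
  $rpc bdev_passthru_create -b Malloc0 -p Passthru0    # stack a passthru bdev on Malloc0
  $rpc bdev_get_bdevs | jq length                      # 2: Malloc0 plus Passthru0
  $rpc bdev_passthru_delete Passthru0
  $rpc bdev_malloc_delete Malloc0
  $rpc bdev_get_bdevs | jq length                      # back to 0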
00:07:28.009 [2024-11-20 15:09:06.560190] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.339 15:09:06 rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:28.339 15:09:06 rpc -- common/autotest_common.sh@868 -- # return 0 00:07:28.339 15:09:06 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:07:28.339 15:09:06 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:07:28.339 15:09:06 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:07:28.339 15:09:06 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:07:28.339 15:09:06 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:28.339 15:09:06 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:28.339 15:09:06 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:28.339 ************************************ 00:07:28.339 START TEST rpc_integrity 00:07:28.339 ************************************ 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@1129 -- # rpc_integrity 00:07:28.339 15:09:06 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:28.339 15:09:06 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:07:28.339 15:09:06 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:07:28.339 15:09:06 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:07:28.339 15:09:06 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:28.339 15:09:06 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:07:28.339 15:09:06 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:28.339 15:09:06 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:07:28.339 { 00:07:28.339 "name": "Malloc0", 00:07:28.339 "aliases": [ 00:07:28.339 "77ab4eac-dfb9-485c-9c07-6724a4752d97" 00:07:28.339 ], 00:07:28.339 "product_name": "Malloc disk", 00:07:28.339 "block_size": 512, 00:07:28.339 "num_blocks": 16384, 00:07:28.339 "uuid": "77ab4eac-dfb9-485c-9c07-6724a4752d97", 00:07:28.339 "assigned_rate_limits": { 00:07:28.339 "rw_ios_per_sec": 0, 00:07:28.339 "rw_mbytes_per_sec": 0, 00:07:28.339 "r_mbytes_per_sec": 0, 00:07:28.339 "w_mbytes_per_sec": 
0 00:07:28.339 }, 00:07:28.339 "claimed": false, 00:07:28.339 "zoned": false, 00:07:28.339 "supported_io_types": { 00:07:28.339 "read": true, 00:07:28.339 "write": true, 00:07:28.339 "unmap": true, 00:07:28.339 "flush": true, 00:07:28.339 "reset": true, 00:07:28.339 "nvme_admin": false, 00:07:28.339 "nvme_io": false, 00:07:28.339 "nvme_io_md": false, 00:07:28.339 "write_zeroes": true, 00:07:28.339 "zcopy": true, 00:07:28.339 "get_zone_info": false, 00:07:28.339 "zone_management": false, 00:07:28.339 "zone_append": false, 00:07:28.339 "compare": false, 00:07:28.339 "compare_and_write": false, 00:07:28.339 "abort": true, 00:07:28.339 "seek_hole": false, 00:07:28.339 "seek_data": false, 00:07:28.339 "copy": true, 00:07:28.339 "nvme_iov_md": false 00:07:28.339 }, 00:07:28.339 "memory_domains": [ 00:07:28.339 { 00:07:28.339 "dma_device_id": "system", 00:07:28.339 "dma_device_type": 1 00:07:28.339 }, 00:07:28.339 { 00:07:28.339 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:28.339 "dma_device_type": 2 00:07:28.339 } 00:07:28.339 ], 00:07:28.339 "driver_specific": {} 00:07:28.339 } 00:07:28.339 ]' 00:07:28.339 15:09:06 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:07:28.339 15:09:06 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:07:28.339 15:09:06 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:28.339 [2024-11-20 15:09:06.923767] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:07:28.339 [2024-11-20 15:09:06.923803] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:28.339 [2024-11-20 15:09:06.923821] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4f2e060 00:07:28.339 [2024-11-20 15:09:06.923830] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:28.339 [2024-11-20 15:09:06.924658] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:28.339 [2024-11-20 15:09:06.924683] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:07:28.339 Passthru0 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:28.339 15:09:06 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:28.339 15:09:06 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:07:28.339 { 00:07:28.339 "name": "Malloc0", 00:07:28.339 "aliases": [ 00:07:28.339 "77ab4eac-dfb9-485c-9c07-6724a4752d97" 00:07:28.339 ], 00:07:28.339 "product_name": "Malloc disk", 00:07:28.339 "block_size": 512, 00:07:28.339 "num_blocks": 16384, 00:07:28.339 "uuid": "77ab4eac-dfb9-485c-9c07-6724a4752d97", 00:07:28.339 "assigned_rate_limits": { 00:07:28.339 "rw_ios_per_sec": 0, 00:07:28.339 "rw_mbytes_per_sec": 0, 00:07:28.339 "r_mbytes_per_sec": 0, 00:07:28.339 "w_mbytes_per_sec": 0 00:07:28.339 }, 00:07:28.339 "claimed": true, 00:07:28.339 "claim_type": "exclusive_write", 00:07:28.339 "zoned": false, 00:07:28.339 "supported_io_types": { 00:07:28.339 "read": true, 00:07:28.339 "write": true, 00:07:28.339 "unmap": true, 
00:07:28.339 "flush": true, 00:07:28.339 "reset": true, 00:07:28.339 "nvme_admin": false, 00:07:28.339 "nvme_io": false, 00:07:28.339 "nvme_io_md": false, 00:07:28.339 "write_zeroes": true, 00:07:28.339 "zcopy": true, 00:07:28.339 "get_zone_info": false, 00:07:28.339 "zone_management": false, 00:07:28.339 "zone_append": false, 00:07:28.339 "compare": false, 00:07:28.339 "compare_and_write": false, 00:07:28.339 "abort": true, 00:07:28.339 "seek_hole": false, 00:07:28.339 "seek_data": false, 00:07:28.339 "copy": true, 00:07:28.339 "nvme_iov_md": false 00:07:28.339 }, 00:07:28.339 "memory_domains": [ 00:07:28.339 { 00:07:28.339 "dma_device_id": "system", 00:07:28.339 "dma_device_type": 1 00:07:28.339 }, 00:07:28.339 { 00:07:28.339 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:28.339 "dma_device_type": 2 00:07:28.339 } 00:07:28.339 ], 00:07:28.339 "driver_specific": {} 00:07:28.339 }, 00:07:28.339 { 00:07:28.339 "name": "Passthru0", 00:07:28.339 "aliases": [ 00:07:28.339 "2f547bf0-0344-5f04-8ed0-933555c0d4af" 00:07:28.339 ], 00:07:28.339 "product_name": "passthru", 00:07:28.339 "block_size": 512, 00:07:28.339 "num_blocks": 16384, 00:07:28.339 "uuid": "2f547bf0-0344-5f04-8ed0-933555c0d4af", 00:07:28.339 "assigned_rate_limits": { 00:07:28.339 "rw_ios_per_sec": 0, 00:07:28.339 "rw_mbytes_per_sec": 0, 00:07:28.339 "r_mbytes_per_sec": 0, 00:07:28.339 "w_mbytes_per_sec": 0 00:07:28.339 }, 00:07:28.339 "claimed": false, 00:07:28.339 "zoned": false, 00:07:28.339 "supported_io_types": { 00:07:28.339 "read": true, 00:07:28.339 "write": true, 00:07:28.339 "unmap": true, 00:07:28.339 "flush": true, 00:07:28.339 "reset": true, 00:07:28.339 "nvme_admin": false, 00:07:28.339 "nvme_io": false, 00:07:28.339 "nvme_io_md": false, 00:07:28.339 "write_zeroes": true, 00:07:28.339 "zcopy": true, 00:07:28.339 "get_zone_info": false, 00:07:28.339 "zone_management": false, 00:07:28.339 "zone_append": false, 00:07:28.339 "compare": false, 00:07:28.339 "compare_and_write": false, 00:07:28.339 "abort": true, 00:07:28.339 "seek_hole": false, 00:07:28.339 "seek_data": false, 00:07:28.339 "copy": true, 00:07:28.339 "nvme_iov_md": false 00:07:28.339 }, 00:07:28.339 "memory_domains": [ 00:07:28.339 { 00:07:28.339 "dma_device_id": "system", 00:07:28.339 "dma_device_type": 1 00:07:28.339 }, 00:07:28.339 { 00:07:28.339 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:28.339 "dma_device_type": 2 00:07:28.339 } 00:07:28.339 ], 00:07:28.339 "driver_specific": { 00:07:28.339 "passthru": { 00:07:28.339 "name": "Passthru0", 00:07:28.339 "base_bdev_name": "Malloc0" 00:07:28.339 } 00:07:28.339 } 00:07:28.339 } 00:07:28.339 ]' 00:07:28.339 15:09:06 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:07:28.339 15:09:06 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:07:28.339 15:09:06 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:28.339 15:09:06 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:28.339 15:09:06 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:28.339 15:09:07 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:28.339 15:09:07 rpc.rpc_integrity -- 
rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:07:28.339 15:09:07 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:28.339 15:09:07 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:28.339 15:09:07 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:28.339 15:09:07 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:07:28.339 15:09:07 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:07:28.598 15:09:07 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:07:28.598 00:07:28.598 real 0m0.253s 00:07:28.598 user 0m0.143s 00:07:28.598 sys 0m0.044s 00:07:28.598 15:09:07 rpc.rpc_integrity -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:28.598 15:09:07 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:28.598 ************************************ 00:07:28.598 END TEST rpc_integrity 00:07:28.598 ************************************ 00:07:28.598 15:09:07 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:07:28.598 15:09:07 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:28.598 15:09:07 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:28.598 15:09:07 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:28.598 ************************************ 00:07:28.598 START TEST rpc_plugins 00:07:28.598 ************************************ 00:07:28.598 15:09:07 rpc.rpc_plugins -- common/autotest_common.sh@1129 -- # rpc_plugins 00:07:28.598 15:09:07 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:07:28.598 15:09:07 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:28.598 15:09:07 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:28.598 15:09:07 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:28.598 15:09:07 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:07:28.598 15:09:07 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:07:28.598 15:09:07 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:28.598 15:09:07 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:28.598 15:09:07 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:28.598 15:09:07 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:07:28.598 { 00:07:28.598 "name": "Malloc1", 00:07:28.598 "aliases": [ 00:07:28.598 "f731c9b0-1f16-49f0-893c-251857112d46" 00:07:28.598 ], 00:07:28.598 "product_name": "Malloc disk", 00:07:28.598 "block_size": 4096, 00:07:28.598 "num_blocks": 256, 00:07:28.598 "uuid": "f731c9b0-1f16-49f0-893c-251857112d46", 00:07:28.598 "assigned_rate_limits": { 00:07:28.598 "rw_ios_per_sec": 0, 00:07:28.598 "rw_mbytes_per_sec": 0, 00:07:28.598 "r_mbytes_per_sec": 0, 00:07:28.598 "w_mbytes_per_sec": 0 00:07:28.598 }, 00:07:28.598 "claimed": false, 00:07:28.598 "zoned": false, 00:07:28.598 "supported_io_types": { 00:07:28.598 "read": true, 00:07:28.598 "write": true, 00:07:28.598 "unmap": true, 00:07:28.598 "flush": true, 00:07:28.598 "reset": true, 00:07:28.598 "nvme_admin": false, 00:07:28.598 "nvme_io": false, 00:07:28.598 "nvme_io_md": false, 00:07:28.598 "write_zeroes": true, 00:07:28.598 "zcopy": true, 00:07:28.598 "get_zone_info": false, 00:07:28.598 "zone_management": false, 00:07:28.598 "zone_append": false, 00:07:28.598 "compare": false, 00:07:28.598 "compare_and_write": false, 00:07:28.598 "abort": true, 00:07:28.598 "seek_hole": false, 00:07:28.598 "seek_data": false, 00:07:28.598 "copy": true, 00:07:28.598 
"nvme_iov_md": false 00:07:28.598 }, 00:07:28.598 "memory_domains": [ 00:07:28.598 { 00:07:28.598 "dma_device_id": "system", 00:07:28.598 "dma_device_type": 1 00:07:28.598 }, 00:07:28.598 { 00:07:28.598 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:28.598 "dma_device_type": 2 00:07:28.598 } 00:07:28.598 ], 00:07:28.598 "driver_specific": {} 00:07:28.598 } 00:07:28.598 ]' 00:07:28.598 15:09:07 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:07:28.598 15:09:07 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:07:28.598 15:09:07 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:07:28.598 15:09:07 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:28.598 15:09:07 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:28.598 15:09:07 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:28.598 15:09:07 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:07:28.598 15:09:07 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:28.598 15:09:07 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:28.598 15:09:07 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:28.598 15:09:07 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:07:28.598 15:09:07 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:07:28.598 15:09:07 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:07:28.598 00:07:28.598 real 0m0.139s 00:07:28.598 user 0m0.080s 00:07:28.598 sys 0m0.021s 00:07:28.598 15:09:07 rpc.rpc_plugins -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:28.598 15:09:07 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:28.598 ************************************ 00:07:28.598 END TEST rpc_plugins 00:07:28.598 ************************************ 00:07:28.856 15:09:07 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:07:28.856 15:09:07 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:28.856 15:09:07 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:28.856 15:09:07 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:28.856 ************************************ 00:07:28.856 START TEST rpc_trace_cmd_test 00:07:28.856 ************************************ 00:07:28.856 15:09:07 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1129 -- # rpc_trace_cmd_test 00:07:28.856 15:09:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:07:28.856 15:09:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:07:28.856 15:09:07 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:28.856 15:09:07 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:07:28.856 15:09:07 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:28.856 15:09:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:07:28.856 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1462471", 00:07:28.856 "tpoint_group_mask": "0x8", 00:07:28.856 "iscsi_conn": { 00:07:28.856 "mask": "0x2", 00:07:28.856 "tpoint_mask": "0x0" 00:07:28.856 }, 00:07:28.856 "scsi": { 00:07:28.856 "mask": "0x4", 00:07:28.856 "tpoint_mask": "0x0" 00:07:28.856 }, 00:07:28.856 "bdev": { 00:07:28.856 "mask": "0x8", 00:07:28.856 "tpoint_mask": "0xffffffffffffffff" 00:07:28.856 }, 00:07:28.856 "nvmf_rdma": { 00:07:28.856 "mask": "0x10", 00:07:28.856 "tpoint_mask": "0x0" 00:07:28.856 }, 00:07:28.856 "nvmf_tcp": { 00:07:28.856 "mask": "0x20", 
00:07:28.856 "tpoint_mask": "0x0" 00:07:28.856 }, 00:07:28.856 "ftl": { 00:07:28.856 "mask": "0x40", 00:07:28.856 "tpoint_mask": "0x0" 00:07:28.856 }, 00:07:28.856 "blobfs": { 00:07:28.856 "mask": "0x80", 00:07:28.856 "tpoint_mask": "0x0" 00:07:28.856 }, 00:07:28.856 "dsa": { 00:07:28.856 "mask": "0x200", 00:07:28.856 "tpoint_mask": "0x0" 00:07:28.856 }, 00:07:28.856 "thread": { 00:07:28.856 "mask": "0x400", 00:07:28.856 "tpoint_mask": "0x0" 00:07:28.856 }, 00:07:28.856 "nvme_pcie": { 00:07:28.856 "mask": "0x800", 00:07:28.856 "tpoint_mask": "0x0" 00:07:28.856 }, 00:07:28.856 "iaa": { 00:07:28.856 "mask": "0x1000", 00:07:28.856 "tpoint_mask": "0x0" 00:07:28.856 }, 00:07:28.856 "nvme_tcp": { 00:07:28.856 "mask": "0x2000", 00:07:28.856 "tpoint_mask": "0x0" 00:07:28.856 }, 00:07:28.856 "bdev_nvme": { 00:07:28.856 "mask": "0x4000", 00:07:28.856 "tpoint_mask": "0x0" 00:07:28.856 }, 00:07:28.856 "sock": { 00:07:28.856 "mask": "0x8000", 00:07:28.856 "tpoint_mask": "0x0" 00:07:28.856 }, 00:07:28.856 "blob": { 00:07:28.856 "mask": "0x10000", 00:07:28.856 "tpoint_mask": "0x0" 00:07:28.856 }, 00:07:28.856 "bdev_raid": { 00:07:28.856 "mask": "0x20000", 00:07:28.856 "tpoint_mask": "0x0" 00:07:28.856 }, 00:07:28.856 "scheduler": { 00:07:28.856 "mask": "0x40000", 00:07:28.856 "tpoint_mask": "0x0" 00:07:28.856 } 00:07:28.856 }' 00:07:28.856 15:09:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:07:28.856 15:09:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 19 -gt 2 ']' 00:07:28.856 15:09:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:07:28.856 15:09:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:07:28.856 15:09:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:07:28.856 15:09:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:07:28.856 15:09:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:07:28.856 15:09:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:07:28.856 15:09:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:07:28.856 15:09:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:07:28.856 00:07:28.856 real 0m0.174s 00:07:28.856 user 0m0.140s 00:07:28.856 sys 0m0.028s 00:07:28.856 15:09:07 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:28.856 15:09:07 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:07:28.856 ************************************ 00:07:28.856 END TEST rpc_trace_cmd_test 00:07:28.856 ************************************ 00:07:29.113 15:09:07 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:07:29.113 15:09:07 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:07:29.113 15:09:07 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:07:29.113 15:09:07 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:29.113 15:09:07 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:29.113 15:09:07 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:29.113 ************************************ 00:07:29.113 START TEST rpc_daemon_integrity 00:07:29.113 ************************************ 00:07:29.113 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1129 -- # rpc_integrity 00:07:29.113 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:07:29.113 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:29.114 15:09:07 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:07:29.114 { 00:07:29.114 "name": "Malloc2", 00:07:29.114 "aliases": [ 00:07:29.114 "c5965b65-6cf0-486c-b176-997c1923e9fd" 00:07:29.114 ], 00:07:29.114 "product_name": "Malloc disk", 00:07:29.114 "block_size": 512, 00:07:29.114 "num_blocks": 16384, 00:07:29.114 "uuid": "c5965b65-6cf0-486c-b176-997c1923e9fd", 00:07:29.114 "assigned_rate_limits": { 00:07:29.114 "rw_ios_per_sec": 0, 00:07:29.114 "rw_mbytes_per_sec": 0, 00:07:29.114 "r_mbytes_per_sec": 0, 00:07:29.114 "w_mbytes_per_sec": 0 00:07:29.114 }, 00:07:29.114 "claimed": false, 00:07:29.114 "zoned": false, 00:07:29.114 "supported_io_types": { 00:07:29.114 "read": true, 00:07:29.114 "write": true, 00:07:29.114 "unmap": true, 00:07:29.114 "flush": true, 00:07:29.114 "reset": true, 00:07:29.114 "nvme_admin": false, 00:07:29.114 "nvme_io": false, 00:07:29.114 "nvme_io_md": false, 00:07:29.114 "write_zeroes": true, 00:07:29.114 "zcopy": true, 00:07:29.114 "get_zone_info": false, 00:07:29.114 "zone_management": false, 00:07:29.114 "zone_append": false, 00:07:29.114 "compare": false, 00:07:29.114 "compare_and_write": false, 00:07:29.114 "abort": true, 00:07:29.114 "seek_hole": false, 00:07:29.114 "seek_data": false, 00:07:29.114 "copy": true, 00:07:29.114 "nvme_iov_md": false 00:07:29.114 }, 00:07:29.114 "memory_domains": [ 00:07:29.114 { 00:07:29.114 "dma_device_id": "system", 00:07:29.114 "dma_device_type": 1 00:07:29.114 }, 00:07:29.114 { 00:07:29.114 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:29.114 "dma_device_type": 2 00:07:29.114 } 00:07:29.114 ], 00:07:29.114 "driver_specific": {} 00:07:29.114 } 00:07:29.114 ]' 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:29.114 [2024-11-20 15:09:07.741884] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:07:29.114 
[2024-11-20 15:09:07.741921] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:29.114 [2024-11-20 15:09:07.741937] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x505db80 00:07:29.114 [2024-11-20 15:09:07.741947] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:29.114 [2024-11-20 15:09:07.742744] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:29.114 [2024-11-20 15:09:07.742767] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:07:29.114 Passthru0 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:07:29.114 { 00:07:29.114 "name": "Malloc2", 00:07:29.114 "aliases": [ 00:07:29.114 "c5965b65-6cf0-486c-b176-997c1923e9fd" 00:07:29.114 ], 00:07:29.114 "product_name": "Malloc disk", 00:07:29.114 "block_size": 512, 00:07:29.114 "num_blocks": 16384, 00:07:29.114 "uuid": "c5965b65-6cf0-486c-b176-997c1923e9fd", 00:07:29.114 "assigned_rate_limits": { 00:07:29.114 "rw_ios_per_sec": 0, 00:07:29.114 "rw_mbytes_per_sec": 0, 00:07:29.114 "r_mbytes_per_sec": 0, 00:07:29.114 "w_mbytes_per_sec": 0 00:07:29.114 }, 00:07:29.114 "claimed": true, 00:07:29.114 "claim_type": "exclusive_write", 00:07:29.114 "zoned": false, 00:07:29.114 "supported_io_types": { 00:07:29.114 "read": true, 00:07:29.114 "write": true, 00:07:29.114 "unmap": true, 00:07:29.114 "flush": true, 00:07:29.114 "reset": true, 00:07:29.114 "nvme_admin": false, 00:07:29.114 "nvme_io": false, 00:07:29.114 "nvme_io_md": false, 00:07:29.114 "write_zeroes": true, 00:07:29.114 "zcopy": true, 00:07:29.114 "get_zone_info": false, 00:07:29.114 "zone_management": false, 00:07:29.114 "zone_append": false, 00:07:29.114 "compare": false, 00:07:29.114 "compare_and_write": false, 00:07:29.114 "abort": true, 00:07:29.114 "seek_hole": false, 00:07:29.114 "seek_data": false, 00:07:29.114 "copy": true, 00:07:29.114 "nvme_iov_md": false 00:07:29.114 }, 00:07:29.114 "memory_domains": [ 00:07:29.114 { 00:07:29.114 "dma_device_id": "system", 00:07:29.114 "dma_device_type": 1 00:07:29.114 }, 00:07:29.114 { 00:07:29.114 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:29.114 "dma_device_type": 2 00:07:29.114 } 00:07:29.114 ], 00:07:29.114 "driver_specific": {} 00:07:29.114 }, 00:07:29.114 { 00:07:29.114 "name": "Passthru0", 00:07:29.114 "aliases": [ 00:07:29.114 "905ed16a-ac07-598d-be3b-a29e3738f78f" 00:07:29.114 ], 00:07:29.114 "product_name": "passthru", 00:07:29.114 "block_size": 512, 00:07:29.114 "num_blocks": 16384, 00:07:29.114 "uuid": "905ed16a-ac07-598d-be3b-a29e3738f78f", 00:07:29.114 "assigned_rate_limits": { 00:07:29.114 "rw_ios_per_sec": 0, 00:07:29.114 "rw_mbytes_per_sec": 0, 00:07:29.114 "r_mbytes_per_sec": 0, 00:07:29.114 "w_mbytes_per_sec": 0 00:07:29.114 }, 00:07:29.114 "claimed": false, 00:07:29.114 "zoned": false, 00:07:29.114 "supported_io_types": { 00:07:29.114 "read": true, 00:07:29.114 "write": true, 00:07:29.114 "unmap": true, 00:07:29.114 "flush": true, 00:07:29.114 "reset": true, 
00:07:29.114 "nvme_admin": false, 00:07:29.114 "nvme_io": false, 00:07:29.114 "nvme_io_md": false, 00:07:29.114 "write_zeroes": true, 00:07:29.114 "zcopy": true, 00:07:29.114 "get_zone_info": false, 00:07:29.114 "zone_management": false, 00:07:29.114 "zone_append": false, 00:07:29.114 "compare": false, 00:07:29.114 "compare_and_write": false, 00:07:29.114 "abort": true, 00:07:29.114 "seek_hole": false, 00:07:29.114 "seek_data": false, 00:07:29.114 "copy": true, 00:07:29.114 "nvme_iov_md": false 00:07:29.114 }, 00:07:29.114 "memory_domains": [ 00:07:29.114 { 00:07:29.114 "dma_device_id": "system", 00:07:29.114 "dma_device_type": 1 00:07:29.114 }, 00:07:29.114 { 00:07:29.114 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:29.114 "dma_device_type": 2 00:07:29.114 } 00:07:29.114 ], 00:07:29.114 "driver_specific": { 00:07:29.114 "passthru": { 00:07:29.114 "name": "Passthru0", 00:07:29.114 "base_bdev_name": "Malloc2" 00:07:29.114 } 00:07:29.114 } 00:07:29.114 } 00:07:29.114 ]' 00:07:29.114 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:07:29.373 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:07:29.373 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:07:29.373 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:29.373 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:29.373 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:29.373 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:07:29.373 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:29.373 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:29.373 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:29.373 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:07:29.373 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:29.373 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:29.373 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:29.373 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:07:29.373 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:07:29.373 15:09:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:07:29.373 00:07:29.373 real 0m0.289s 00:07:29.373 user 0m0.181s 00:07:29.373 sys 0m0.048s 00:07:29.373 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:29.373 15:09:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:29.373 ************************************ 00:07:29.373 END TEST rpc_daemon_integrity 00:07:29.373 ************************************ 00:07:29.373 15:09:07 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:07:29.373 15:09:07 rpc -- rpc/rpc.sh@84 -- # killprocess 1462471 00:07:29.373 15:09:07 rpc -- common/autotest_common.sh@954 -- # '[' -z 1462471 ']' 00:07:29.374 15:09:07 rpc -- common/autotest_common.sh@958 -- # kill -0 1462471 00:07:29.374 15:09:07 rpc -- common/autotest_common.sh@959 -- # uname 00:07:29.374 15:09:07 rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:29.374 15:09:07 rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1462471 
00:07:29.374 15:09:07 rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:29.374 15:09:07 rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:29.374 15:09:07 rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1462471' 00:07:29.374 killing process with pid 1462471 00:07:29.374 15:09:07 rpc -- common/autotest_common.sh@973 -- # kill 1462471 00:07:29.374 15:09:07 rpc -- common/autotest_common.sh@978 -- # wait 1462471 00:07:29.631 00:07:29.631 real 0m2.061s 00:07:29.631 user 0m2.548s 00:07:29.631 sys 0m0.794s 00:07:29.631 15:09:08 rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:29.631 15:09:08 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:29.631 ************************************ 00:07:29.631 END TEST rpc 00:07:29.631 ************************************ 00:07:29.889 15:09:08 -- spdk/autotest.sh@157 -- # run_test skip_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:07:29.889 15:09:08 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:29.889 15:09:08 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:29.889 15:09:08 -- common/autotest_common.sh@10 -- # set +x 00:07:29.889 ************************************ 00:07:29.889 START TEST skip_rpc 00:07:29.889 ************************************ 00:07:29.889 15:09:08 skip_rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:07:29.889 * Looking for test storage... 00:07:29.889 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:07:29.889 15:09:08 skip_rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:29.889 15:09:08 skip_rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:29.889 15:09:08 skip_rpc -- common/autotest_common.sh@1693 -- # lcov --version 00:07:29.889 15:09:08 skip_rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:29.889 15:09:08 skip_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:29.889 15:09:08 skip_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:29.889 15:09:08 skip_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:29.890 15:09:08 skip_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:07:29.890 15:09:08 skip_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:07:29.890 15:09:08 skip_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:07:29.890 15:09:08 skip_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:07:29.890 15:09:08 skip_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:07:29.890 15:09:08 skip_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:07:30.147 15:09:08 skip_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:07:30.147 15:09:08 skip_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:30.147 15:09:08 skip_rpc -- scripts/common.sh@344 -- # case "$op" in 00:07:30.147 15:09:08 skip_rpc -- scripts/common.sh@345 -- # : 1 00:07:30.147 15:09:08 skip_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:30.147 15:09:08 skip_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:30.147 15:09:08 skip_rpc -- scripts/common.sh@365 -- # decimal 1 00:07:30.147 15:09:08 skip_rpc -- scripts/common.sh@353 -- # local d=1 00:07:30.147 15:09:08 skip_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:30.147 15:09:08 skip_rpc -- scripts/common.sh@355 -- # echo 1 00:07:30.147 15:09:08 skip_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:07:30.147 15:09:08 skip_rpc -- scripts/common.sh@366 -- # decimal 2 00:07:30.147 15:09:08 skip_rpc -- scripts/common.sh@353 -- # local d=2 00:07:30.147 15:09:08 skip_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:30.147 15:09:08 skip_rpc -- scripts/common.sh@355 -- # echo 2 00:07:30.147 15:09:08 skip_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:07:30.147 15:09:08 skip_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:30.147 15:09:08 skip_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:30.147 15:09:08 skip_rpc -- scripts/common.sh@368 -- # return 0 00:07:30.147 15:09:08 skip_rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:30.147 15:09:08 skip_rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:30.147 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.147 --rc genhtml_branch_coverage=1 00:07:30.147 --rc genhtml_function_coverage=1 00:07:30.147 --rc genhtml_legend=1 00:07:30.147 --rc geninfo_all_blocks=1 00:07:30.147 --rc geninfo_unexecuted_blocks=1 00:07:30.147 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.147 ' 00:07:30.147 15:09:08 skip_rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:30.147 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.147 --rc genhtml_branch_coverage=1 00:07:30.147 --rc genhtml_function_coverage=1 00:07:30.147 --rc genhtml_legend=1 00:07:30.147 --rc geninfo_all_blocks=1 00:07:30.147 --rc geninfo_unexecuted_blocks=1 00:07:30.147 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.147 ' 00:07:30.147 15:09:08 skip_rpc -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:30.147 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.147 --rc genhtml_branch_coverage=1 00:07:30.147 --rc genhtml_function_coverage=1 00:07:30.147 --rc genhtml_legend=1 00:07:30.147 --rc geninfo_all_blocks=1 00:07:30.147 --rc geninfo_unexecuted_blocks=1 00:07:30.147 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.147 ' 00:07:30.147 15:09:08 skip_rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:30.147 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.147 --rc genhtml_branch_coverage=1 00:07:30.147 --rc genhtml_function_coverage=1 00:07:30.147 --rc genhtml_legend=1 00:07:30.147 --rc geninfo_all_blocks=1 00:07:30.147 --rc geninfo_unexecuted_blocks=1 00:07:30.147 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.147 ' 00:07:30.147 15:09:08 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:07:30.147 15:09:08 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:07:30.147 15:09:08 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:07:30.147 15:09:08 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:30.147 15:09:08 
skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:30.147 15:09:08 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:30.147 ************************************ 00:07:30.147 START TEST skip_rpc 00:07:30.147 ************************************ 00:07:30.147 15:09:08 skip_rpc.skip_rpc -- common/autotest_common.sh@1129 -- # test_skip_rpc 00:07:30.147 15:09:08 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:07:30.147 15:09:08 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=1462968 00:07:30.147 15:09:08 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:30.147 15:09:08 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:07:30.147 [2024-11-20 15:09:08.643204] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:07:30.147 [2024-11-20 15:09:08.643250] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1462968 ] 00:07:30.147 [2024-11-20 15:09:08.724067] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.147 [2024-11-20 15:09:08.748014] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # local es=0 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@654 -- # valid_exec_arg rpc_cmd spdk_get_version 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # local arg=rpc_cmd 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # type -t rpc_cmd 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@655 -- # rpc_cmd spdk_get_version 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@655 -- # es=1 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 1462968 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # '[' -z 1462968 ']' 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # kill -0 1462968 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # uname 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1462968 
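The skip_rpc sequence above is the inverse check: spdk_tgt was launched with --no-rpc-server, so the NOT wrapper asserts that rpc_cmd spdk_get_version exits non-zero (es=1). A hedged stand-alone sketch of the same assertion, with paths abbreviated relative to the workspace tree used throughout this log; the test itself simply sleeps 5 seconds before probing:
# start the target with the RPC server disabled
./build/bin/spdk_tgt --no-rpc-server -m 0x1 &
TGT_PID=$!
sleep 5                                        # crude settle time, mirroring the test
if ./scripts/rpc.py spdk_get_version; then
    echo "ERROR: RPC call succeeded despite --no-rpc-server" >&2
    kill -9 "$TGT_PID"; exit 1
fi
kill -9 "$TGT_PID"                             # expected path: the RPC call failed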
00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1462968' 00:07:35.414 killing process with pid 1462968 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@973 -- # kill 1462968 00:07:35.414 15:09:13 skip_rpc.skip_rpc -- common/autotest_common.sh@978 -- # wait 1462968 00:07:35.414 00:07:35.414 real 0m5.388s 00:07:35.414 user 0m5.120s 00:07:35.414 sys 0m0.294s 00:07:35.414 15:09:14 skip_rpc.skip_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:35.414 15:09:14 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:35.414 ************************************ 00:07:35.414 END TEST skip_rpc 00:07:35.414 ************************************ 00:07:35.414 15:09:14 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:07:35.414 15:09:14 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:35.414 15:09:14 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:35.414 15:09:14 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:35.414 ************************************ 00:07:35.414 START TEST skip_rpc_with_json 00:07:35.414 ************************************ 00:07:35.414 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1129 -- # test_skip_rpc_with_json 00:07:35.414 15:09:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:07:35.414 15:09:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=1464089 00:07:35.414 15:09:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:35.414 15:09:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:35.414 15:09:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 1464089 00:07:35.674 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # '[' -z 1464089 ']' 00:07:35.674 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:35.674 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:35.674 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:35.674 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:35.674 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:35.674 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:35.674 [2024-11-20 15:09:14.123212] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
00:07:35.674 [2024-11-20 15:09:14.123279] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1464089 ] 00:07:35.674 [2024-11-20 15:09:14.208977] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.674 [2024-11-20 15:09:14.235738] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.934 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:35.934 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@868 -- # return 0 00:07:35.934 15:09:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:07:35.934 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:35.934 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:35.934 [2024-11-20 15:09:14.445161] nvmf_rpc.c:2703:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:07:35.934 request: 00:07:35.934 { 00:07:35.934 "trtype": "tcp", 00:07:35.934 "method": "nvmf_get_transports", 00:07:35.934 "req_id": 1 00:07:35.934 } 00:07:35.934 Got JSON-RPC error response 00:07:35.934 response: 00:07:35.934 { 00:07:35.934 "code": -19, 00:07:35.934 "message": "No such device" 00:07:35.934 } 00:07:35.934 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:07:35.934 15:09:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:07:35.934 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:35.934 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:35.934 [2024-11-20 15:09:14.457252] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:35.934 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:35.934 15:09:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:07:35.934 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:35.934 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:36.194 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:36.194 15:09:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:07:36.194 { 00:07:36.194 "subsystems": [ 00:07:36.194 { 00:07:36.194 "subsystem": "scheduler", 00:07:36.194 "config": [ 00:07:36.194 { 00:07:36.194 "method": "framework_set_scheduler", 00:07:36.194 "params": { 00:07:36.194 "name": "static" 00:07:36.194 } 00:07:36.194 } 00:07:36.194 ] 00:07:36.194 }, 00:07:36.194 { 00:07:36.194 "subsystem": "vmd", 00:07:36.194 "config": [] 00:07:36.194 }, 00:07:36.194 { 00:07:36.194 "subsystem": "sock", 00:07:36.194 "config": [ 00:07:36.194 { 00:07:36.194 "method": "sock_set_default_impl", 00:07:36.194 "params": { 00:07:36.194 "impl_name": "posix" 00:07:36.194 } 00:07:36.194 }, 00:07:36.194 { 00:07:36.194 "method": "sock_impl_set_options", 00:07:36.194 "params": { 00:07:36.194 "impl_name": "ssl", 00:07:36.194 "recv_buf_size": 4096, 00:07:36.194 "send_buf_size": 4096, 00:07:36.194 "enable_recv_pipe": true, 00:07:36.194 "enable_quickack": false, 00:07:36.194 
"enable_placement_id": 0, 00:07:36.194 "enable_zerocopy_send_server": true, 00:07:36.194 "enable_zerocopy_send_client": false, 00:07:36.194 "zerocopy_threshold": 0, 00:07:36.194 "tls_version": 0, 00:07:36.194 "enable_ktls": false 00:07:36.194 } 00:07:36.194 }, 00:07:36.194 { 00:07:36.194 "method": "sock_impl_set_options", 00:07:36.194 "params": { 00:07:36.194 "impl_name": "posix", 00:07:36.194 "recv_buf_size": 2097152, 00:07:36.194 "send_buf_size": 2097152, 00:07:36.194 "enable_recv_pipe": true, 00:07:36.194 "enable_quickack": false, 00:07:36.194 "enable_placement_id": 0, 00:07:36.194 "enable_zerocopy_send_server": true, 00:07:36.194 "enable_zerocopy_send_client": false, 00:07:36.194 "zerocopy_threshold": 0, 00:07:36.194 "tls_version": 0, 00:07:36.194 "enable_ktls": false 00:07:36.194 } 00:07:36.194 } 00:07:36.194 ] 00:07:36.194 }, 00:07:36.194 { 00:07:36.194 "subsystem": "iobuf", 00:07:36.194 "config": [ 00:07:36.194 { 00:07:36.194 "method": "iobuf_set_options", 00:07:36.194 "params": { 00:07:36.194 "small_pool_count": 8192, 00:07:36.194 "large_pool_count": 1024, 00:07:36.194 "small_bufsize": 8192, 00:07:36.194 "large_bufsize": 135168, 00:07:36.194 "enable_numa": false 00:07:36.194 } 00:07:36.194 } 00:07:36.194 ] 00:07:36.194 }, 00:07:36.194 { 00:07:36.194 "subsystem": "keyring", 00:07:36.194 "config": [] 00:07:36.194 }, 00:07:36.194 { 00:07:36.194 "subsystem": "vfio_user_target", 00:07:36.194 "config": null 00:07:36.194 }, 00:07:36.194 { 00:07:36.194 "subsystem": "fsdev", 00:07:36.194 "config": [ 00:07:36.194 { 00:07:36.194 "method": "fsdev_set_opts", 00:07:36.194 "params": { 00:07:36.194 "fsdev_io_pool_size": 65535, 00:07:36.194 "fsdev_io_cache_size": 256 00:07:36.194 } 00:07:36.194 } 00:07:36.194 ] 00:07:36.194 }, 00:07:36.194 { 00:07:36.194 "subsystem": "accel", 00:07:36.194 "config": [ 00:07:36.194 { 00:07:36.194 "method": "accel_set_options", 00:07:36.194 "params": { 00:07:36.194 "small_cache_size": 128, 00:07:36.194 "large_cache_size": 16, 00:07:36.194 "task_count": 2048, 00:07:36.194 "sequence_count": 2048, 00:07:36.194 "buf_count": 2048 00:07:36.194 } 00:07:36.194 } 00:07:36.194 ] 00:07:36.194 }, 00:07:36.194 { 00:07:36.194 "subsystem": "bdev", 00:07:36.194 "config": [ 00:07:36.194 { 00:07:36.194 "method": "bdev_set_options", 00:07:36.194 "params": { 00:07:36.194 "bdev_io_pool_size": 65535, 00:07:36.194 "bdev_io_cache_size": 256, 00:07:36.194 "bdev_auto_examine": true, 00:07:36.194 "iobuf_small_cache_size": 128, 00:07:36.194 "iobuf_large_cache_size": 16 00:07:36.194 } 00:07:36.194 }, 00:07:36.194 { 00:07:36.194 "method": "bdev_raid_set_options", 00:07:36.194 "params": { 00:07:36.194 "process_window_size_kb": 1024, 00:07:36.194 "process_max_bandwidth_mb_sec": 0 00:07:36.194 } 00:07:36.194 }, 00:07:36.194 { 00:07:36.194 "method": "bdev_nvme_set_options", 00:07:36.194 "params": { 00:07:36.194 "action_on_timeout": "none", 00:07:36.194 "timeout_us": 0, 00:07:36.194 "timeout_admin_us": 0, 00:07:36.194 "keep_alive_timeout_ms": 10000, 00:07:36.194 "arbitration_burst": 0, 00:07:36.194 "low_priority_weight": 0, 00:07:36.194 "medium_priority_weight": 0, 00:07:36.194 "high_priority_weight": 0, 00:07:36.194 "nvme_adminq_poll_period_us": 10000, 00:07:36.194 "nvme_ioq_poll_period_us": 0, 00:07:36.194 "io_queue_requests": 0, 00:07:36.194 "delay_cmd_submit": true, 00:07:36.194 "transport_retry_count": 4, 00:07:36.194 "bdev_retry_count": 3, 00:07:36.194 "transport_ack_timeout": 0, 00:07:36.194 "ctrlr_loss_timeout_sec": 0, 00:07:36.194 "reconnect_delay_sec": 0, 00:07:36.194 
"fast_io_fail_timeout_sec": 0, 00:07:36.194 "disable_auto_failback": false, 00:07:36.194 "generate_uuids": false, 00:07:36.194 "transport_tos": 0, 00:07:36.194 "nvme_error_stat": false, 00:07:36.194 "rdma_srq_size": 0, 00:07:36.194 "io_path_stat": false, 00:07:36.194 "allow_accel_sequence": false, 00:07:36.194 "rdma_max_cq_size": 0, 00:07:36.194 "rdma_cm_event_timeout_ms": 0, 00:07:36.194 "dhchap_digests": [ 00:07:36.194 "sha256", 00:07:36.194 "sha384", 00:07:36.194 "sha512" 00:07:36.194 ], 00:07:36.194 "dhchap_dhgroups": [ 00:07:36.194 "null", 00:07:36.194 "ffdhe2048", 00:07:36.194 "ffdhe3072", 00:07:36.194 "ffdhe4096", 00:07:36.194 "ffdhe6144", 00:07:36.194 "ffdhe8192" 00:07:36.194 ] 00:07:36.194 } 00:07:36.194 }, 00:07:36.194 { 00:07:36.194 "method": "bdev_nvme_set_hotplug", 00:07:36.194 "params": { 00:07:36.194 "period_us": 100000, 00:07:36.194 "enable": false 00:07:36.194 } 00:07:36.194 }, 00:07:36.194 { 00:07:36.194 "method": "bdev_iscsi_set_options", 00:07:36.194 "params": { 00:07:36.194 "timeout_sec": 30 00:07:36.194 } 00:07:36.194 }, 00:07:36.194 { 00:07:36.194 "method": "bdev_wait_for_examine" 00:07:36.194 } 00:07:36.194 ] 00:07:36.194 }, 00:07:36.194 { 00:07:36.194 "subsystem": "nvmf", 00:07:36.194 "config": [ 00:07:36.194 { 00:07:36.194 "method": "nvmf_set_config", 00:07:36.194 "params": { 00:07:36.194 "discovery_filter": "match_any", 00:07:36.194 "admin_cmd_passthru": { 00:07:36.194 "identify_ctrlr": false 00:07:36.194 }, 00:07:36.194 "dhchap_digests": [ 00:07:36.194 "sha256", 00:07:36.194 "sha384", 00:07:36.194 "sha512" 00:07:36.194 ], 00:07:36.194 "dhchap_dhgroups": [ 00:07:36.194 "null", 00:07:36.195 "ffdhe2048", 00:07:36.195 "ffdhe3072", 00:07:36.195 "ffdhe4096", 00:07:36.195 "ffdhe6144", 00:07:36.195 "ffdhe8192" 00:07:36.195 ] 00:07:36.195 } 00:07:36.195 }, 00:07:36.195 { 00:07:36.195 "method": "nvmf_set_max_subsystems", 00:07:36.195 "params": { 00:07:36.195 "max_subsystems": 1024 00:07:36.195 } 00:07:36.195 }, 00:07:36.195 { 00:07:36.195 "method": "nvmf_set_crdt", 00:07:36.195 "params": { 00:07:36.195 "crdt1": 0, 00:07:36.195 "crdt2": 0, 00:07:36.195 "crdt3": 0 00:07:36.195 } 00:07:36.195 }, 00:07:36.195 { 00:07:36.195 "method": "nvmf_create_transport", 00:07:36.195 "params": { 00:07:36.195 "trtype": "TCP", 00:07:36.195 "max_queue_depth": 128, 00:07:36.195 "max_io_qpairs_per_ctrlr": 127, 00:07:36.195 "in_capsule_data_size": 4096, 00:07:36.195 "max_io_size": 131072, 00:07:36.195 "io_unit_size": 131072, 00:07:36.195 "max_aq_depth": 128, 00:07:36.195 "num_shared_buffers": 511, 00:07:36.195 "buf_cache_size": 4294967295, 00:07:36.195 "dif_insert_or_strip": false, 00:07:36.195 "zcopy": false, 00:07:36.195 "c2h_success": true, 00:07:36.195 "sock_priority": 0, 00:07:36.195 "abort_timeout_sec": 1, 00:07:36.195 "ack_timeout": 0, 00:07:36.195 "data_wr_pool_size": 0 00:07:36.195 } 00:07:36.195 } 00:07:36.195 ] 00:07:36.195 }, 00:07:36.195 { 00:07:36.195 "subsystem": "nbd", 00:07:36.195 "config": [] 00:07:36.195 }, 00:07:36.195 { 00:07:36.195 "subsystem": "ublk", 00:07:36.195 "config": [] 00:07:36.195 }, 00:07:36.195 { 00:07:36.195 "subsystem": "vhost_blk", 00:07:36.195 "config": [] 00:07:36.195 }, 00:07:36.195 { 00:07:36.195 "subsystem": "scsi", 00:07:36.195 "config": null 00:07:36.195 }, 00:07:36.195 { 00:07:36.195 "subsystem": "iscsi", 00:07:36.195 "config": [ 00:07:36.195 { 00:07:36.195 "method": "iscsi_set_options", 00:07:36.195 "params": { 00:07:36.195 "node_base": "iqn.2016-06.io.spdk", 00:07:36.195 "max_sessions": 128, 00:07:36.195 "max_connections_per_session": 2, 
00:07:36.195 "max_queue_depth": 64, 00:07:36.195 "default_time2wait": 2, 00:07:36.195 "default_time2retain": 20, 00:07:36.195 "first_burst_length": 8192, 00:07:36.195 "immediate_data": true, 00:07:36.195 "allow_duplicated_isid": false, 00:07:36.195 "error_recovery_level": 0, 00:07:36.195 "nop_timeout": 60, 00:07:36.195 "nop_in_interval": 30, 00:07:36.195 "disable_chap": false, 00:07:36.195 "require_chap": false, 00:07:36.195 "mutual_chap": false, 00:07:36.195 "chap_group": 0, 00:07:36.195 "max_large_datain_per_connection": 64, 00:07:36.195 "max_r2t_per_connection": 4, 00:07:36.195 "pdu_pool_size": 36864, 00:07:36.195 "immediate_data_pool_size": 16384, 00:07:36.195 "data_out_pool_size": 2048 00:07:36.195 } 00:07:36.195 } 00:07:36.195 ] 00:07:36.195 }, 00:07:36.195 { 00:07:36.195 "subsystem": "vhost_scsi", 00:07:36.195 "config": [] 00:07:36.195 } 00:07:36.195 ] 00:07:36.195 } 00:07:36.195 15:09:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:07:36.195 15:09:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 1464089 00:07:36.195 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' -z 1464089 ']' 00:07:36.195 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # kill -0 1464089 00:07:36.195 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # uname 00:07:36.195 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:36.195 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1464089 00:07:36.195 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:36.195 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:36.195 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1464089' 00:07:36.195 killing process with pid 1464089 00:07:36.195 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # kill 1464089 00:07:36.195 15:09:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@978 -- # wait 1464089 00:07:36.454 15:09:15 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=1464271 00:07:36.454 15:09:15 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:07:36.454 15:09:15 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:07:41.724 15:09:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 1464271 00:07:41.724 15:09:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' -z 1464271 ']' 00:07:41.724 15:09:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # kill -0 1464271 00:07:41.724 15:09:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # uname 00:07:41.724 15:09:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:41.724 15:09:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1464271 00:07:41.724 15:09:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:41.724 15:09:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:41.724 15:09:20 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1464271' 00:07:41.724 killing process with pid 1464271 00:07:41.724 15:09:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # kill 1464271 00:07:41.724 15:09:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@978 -- # wait 1464271 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:07:41.983 00:07:41.983 real 0m6.332s 00:07:41.983 user 0m5.972s 00:07:41.983 sys 0m0.688s 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:41.983 ************************************ 00:07:41.983 END TEST skip_rpc_with_json 00:07:41.983 ************************************ 00:07:41.983 15:09:20 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:07:41.983 15:09:20 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:41.983 15:09:20 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:41.983 15:09:20 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:41.983 ************************************ 00:07:41.983 START TEST skip_rpc_with_delay 00:07:41.983 ************************************ 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1129 -- # test_skip_rpc_with_delay 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # local es=0 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@655 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:41.983 [2024-11-20 15:09:20.547345] app.c: 842:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@655 -- # es=1 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:41.983 00:07:41.983 real 0m0.048s 00:07:41.983 user 0m0.020s 00:07:41.983 sys 0m0.028s 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:41.983 15:09:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:07:41.983 ************************************ 00:07:41.983 END TEST skip_rpc_with_delay 00:07:41.983 ************************************ 00:07:41.983 15:09:20 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:07:41.983 15:09:20 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:07:41.983 15:09:20 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:07:41.983 15:09:20 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:41.983 15:09:20 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:41.983 15:09:20 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:41.983 ************************************ 00:07:41.983 START TEST exit_on_failed_rpc_init 00:07:41.983 ************************************ 00:07:41.983 15:09:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1129 -- # test_exit_on_failed_rpc_init 00:07:41.983 15:09:20 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=1465023 00:07:41.983 15:09:20 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 1465023 00:07:41.983 15:09:20 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:41.983 15:09:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # '[' -z 1465023 ']' 00:07:41.983 15:09:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:41.983 15:09:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:41.983 15:09:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:41.984 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:41.984 15:09:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:41.984 15:09:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:07:42.241 [2024-11-20 15:09:20.675408] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
00:07:42.241 [2024-11-20 15:09:20.675495] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1465023 ] 00:07:42.241 [2024-11-20 15:09:20.762110] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.241 [2024-11-20 15:09:20.788078] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@868 -- # return 0 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # local es=0 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:42.498 [2024-11-20 15:09:21.024666] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:07:42.498 [2024-11-20 15:09:21.024733] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1465038 ] 00:07:42.498 [2024-11-20 15:09:21.103382] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.498 [2024-11-20 15:09:21.128650] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:42.498 [2024-11-20 15:09:21.128733] rpc.c: 181:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:07:42.498 [2024-11-20 15:09:21.128746] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:07:42.498 [2024-11-20 15:09:21.128754] app.c:1064:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@655 -- # es=234 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@664 -- # es=106 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@665 -- # case "$es" in 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@672 -- # es=1 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 1465023 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # '[' -z 1465023 ']' 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # kill -0 1465023 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # uname 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:42.498 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1465023 00:07:42.756 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:42.756 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:42.756 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1465023' 00:07:42.756 killing process with pid 1465023 00:07:42.756 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@973 -- # kill 1465023 00:07:42.756 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@978 -- # wait 1465023 00:07:43.014 00:07:43.014 real 0m0.896s 00:07:43.014 user 0m0.856s 00:07:43.014 sys 0m0.443s 00:07:43.014 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:43.014 15:09:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:07:43.014 ************************************ 00:07:43.014 END TEST exit_on_failed_rpc_init 00:07:43.014 ************************************ 00:07:43.014 15:09:21 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:07:43.014 00:07:43.014 real 0m13.191s 00:07:43.014 user 0m12.181s 00:07:43.014 sys 0m1.801s 00:07:43.014 15:09:21 skip_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:43.014 15:09:21 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:43.014 ************************************ 00:07:43.014 END TEST skip_rpc 00:07:43.014 ************************************ 00:07:43.014 15:09:21 -- spdk/autotest.sh@158 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:07:43.014 15:09:21 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:43.014 15:09:21 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:43.014 15:09:21 
-- common/autotest_common.sh@10 -- # set +x 00:07:43.014 ************************************ 00:07:43.014 START TEST rpc_client 00:07:43.014 ************************************ 00:07:43.014 15:09:21 rpc_client -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:07:43.273 * Looking for test storage... 00:07:43.273 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:07:43.273 15:09:21 rpc_client -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:43.273 15:09:21 rpc_client -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:43.273 15:09:21 rpc_client -- common/autotest_common.sh@1693 -- # lcov --version 00:07:43.273 15:09:21 rpc_client -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@336 -- # IFS=.-: 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@336 -- # read -ra ver1 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@337 -- # IFS=.-: 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@337 -- # read -ra ver2 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@338 -- # local 'op=<' 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@340 -- # ver1_l=2 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@341 -- # ver2_l=1 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@344 -- # case "$op" in 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@345 -- # : 1 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@365 -- # decimal 1 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@353 -- # local d=1 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@355 -- # echo 1 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@365 -- # ver1[v]=1 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@366 -- # decimal 2 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@353 -- # local d=2 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@355 -- # echo 2 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@366 -- # ver2[v]=2 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:43.273 15:09:21 rpc_client -- scripts/common.sh@368 -- # return 0 00:07:43.273 15:09:21 rpc_client -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:43.273 15:09:21 rpc_client -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:43.273 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.273 --rc genhtml_branch_coverage=1 00:07:43.273 --rc genhtml_function_coverage=1 00:07:43.273 --rc genhtml_legend=1 00:07:43.273 --rc geninfo_all_blocks=1 00:07:43.273 --rc geninfo_unexecuted_blocks=1 00:07:43.273 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.273 ' 00:07:43.273 15:09:21 rpc_client -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:43.273 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.273 --rc genhtml_branch_coverage=1 00:07:43.273 --rc genhtml_function_coverage=1 00:07:43.273 --rc genhtml_legend=1 00:07:43.273 --rc geninfo_all_blocks=1 00:07:43.273 --rc geninfo_unexecuted_blocks=1 00:07:43.273 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.273 ' 00:07:43.273 15:09:21 rpc_client -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:43.273 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.273 --rc genhtml_branch_coverage=1 00:07:43.273 --rc genhtml_function_coverage=1 00:07:43.273 --rc genhtml_legend=1 00:07:43.273 --rc geninfo_all_blocks=1 00:07:43.273 --rc geninfo_unexecuted_blocks=1 00:07:43.273 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.273 ' 00:07:43.273 15:09:21 rpc_client -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:43.273 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.273 --rc genhtml_branch_coverage=1 00:07:43.273 --rc genhtml_function_coverage=1 00:07:43.273 --rc genhtml_legend=1 00:07:43.273 --rc geninfo_all_blocks=1 00:07:43.273 --rc geninfo_unexecuted_blocks=1 00:07:43.273 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.273 ' 00:07:43.273 15:09:21 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:07:43.273 OK 00:07:43.273 15:09:21 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:07:43.273 00:07:43.273 real 0m0.203s 00:07:43.273 user 0m0.119s 00:07:43.273 sys 0m0.095s 00:07:43.273 15:09:21 rpc_client -- common/autotest_common.sh@1130 -- # xtrace_disable 
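Each suite re-runs the same lcov probe: lt 1.15 2 splits both version strings on ".", "-", and ":" and compares them component by component, keeping the pre-2.0 gcov-tool options when the installed lcov is older than 2. A condensed sketch of that comparison (helper name illustrative; the real logic lives in scripts/common.sh):

    version_lt() {                        # returns 0 when $1 < $2
        local IFS=.-: i
        local -a a b
        read -ra a <<< "$1"; read -ra b <<< "$2"
        for ((i = 0; i < (${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]}); i++)); do
            (( ${a[i]:-0} < ${b[i]:-0} )) && return 0   # lower component decides
            (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
        done
        return 1                          # equal versions are not "less than"
    }
    version_lt 1.15 2 && echo "legacy lcov: keep --gcov-tool llvm-gcov.sh opts"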
00:07:43.273 15:09:21 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:07:43.273 ************************************ 00:07:43.273 END TEST rpc_client 00:07:43.273 ************************************ 00:07:43.273 15:09:21 -- spdk/autotest.sh@159 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:07:43.273 15:09:21 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:43.273 15:09:21 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:43.273 15:09:21 -- common/autotest_common.sh@10 -- # set +x 00:07:43.273 ************************************ 00:07:43.273 START TEST json_config 00:07:43.273 ************************************ 00:07:43.273 15:09:21 json_config -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:07:43.533 15:09:22 json_config -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:43.533 15:09:22 json_config -- common/autotest_common.sh@1693 -- # lcov --version 00:07:43.533 15:09:22 json_config -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:43.533 15:09:22 json_config -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:43.533 15:09:22 json_config -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:43.533 15:09:22 json_config -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:43.533 15:09:22 json_config -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:43.533 15:09:22 json_config -- scripts/common.sh@336 -- # IFS=.-: 00:07:43.533 15:09:22 json_config -- scripts/common.sh@336 -- # read -ra ver1 00:07:43.533 15:09:22 json_config -- scripts/common.sh@337 -- # IFS=.-: 00:07:43.533 15:09:22 json_config -- scripts/common.sh@337 -- # read -ra ver2 00:07:43.533 15:09:22 json_config -- scripts/common.sh@338 -- # local 'op=<' 00:07:43.533 15:09:22 json_config -- scripts/common.sh@340 -- # ver1_l=2 00:07:43.533 15:09:22 json_config -- scripts/common.sh@341 -- # ver2_l=1 00:07:43.533 15:09:22 json_config -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:43.533 15:09:22 json_config -- scripts/common.sh@344 -- # case "$op" in 00:07:43.533 15:09:22 json_config -- scripts/common.sh@345 -- # : 1 00:07:43.533 15:09:22 json_config -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:43.533 15:09:22 json_config -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:43.533 15:09:22 json_config -- scripts/common.sh@365 -- # decimal 1 00:07:43.533 15:09:22 json_config -- scripts/common.sh@353 -- # local d=1 00:07:43.533 15:09:22 json_config -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:43.533 15:09:22 json_config -- scripts/common.sh@355 -- # echo 1 00:07:43.533 15:09:22 json_config -- scripts/common.sh@365 -- # ver1[v]=1 00:07:43.533 15:09:22 json_config -- scripts/common.sh@366 -- # decimal 2 00:07:43.533 15:09:22 json_config -- scripts/common.sh@353 -- # local d=2 00:07:43.533 15:09:22 json_config -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:43.533 15:09:22 json_config -- scripts/common.sh@355 -- # echo 2 00:07:43.533 15:09:22 json_config -- scripts/common.sh@366 -- # ver2[v]=2 00:07:43.533 15:09:22 json_config -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:43.533 15:09:22 json_config -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:43.533 15:09:22 json_config -- scripts/common.sh@368 -- # return 0 00:07:43.533 15:09:22 json_config -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:43.533 15:09:22 json_config -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:43.533 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.533 --rc genhtml_branch_coverage=1 00:07:43.533 --rc genhtml_function_coverage=1 00:07:43.533 --rc genhtml_legend=1 00:07:43.533 --rc geninfo_all_blocks=1 00:07:43.533 --rc geninfo_unexecuted_blocks=1 00:07:43.533 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.533 ' 00:07:43.533 15:09:22 json_config -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:43.533 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.533 --rc genhtml_branch_coverage=1 00:07:43.533 --rc genhtml_function_coverage=1 00:07:43.533 --rc genhtml_legend=1 00:07:43.533 --rc geninfo_all_blocks=1 00:07:43.533 --rc geninfo_unexecuted_blocks=1 00:07:43.533 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.533 ' 00:07:43.533 15:09:22 json_config -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:43.533 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.533 --rc genhtml_branch_coverage=1 00:07:43.533 --rc genhtml_function_coverage=1 00:07:43.533 --rc genhtml_legend=1 00:07:43.533 --rc geninfo_all_blocks=1 00:07:43.533 --rc geninfo_unexecuted_blocks=1 00:07:43.533 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.533 ' 00:07:43.533 15:09:22 json_config -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:43.533 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.533 --rc genhtml_branch_coverage=1 00:07:43.533 --rc genhtml_function_coverage=1 00:07:43.533 --rc genhtml_legend=1 00:07:43.533 --rc geninfo_all_blocks=1 00:07:43.534 --rc geninfo_unexecuted_blocks=1 00:07:43.534 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.534 ' 00:07:43.534 15:09:22 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@7 -- # uname -s 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@10 
-- # NVMF_SECOND_PORT=4421 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8023d868-666a-e711-906e-0017a4403562 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=8023d868-666a-e711-906e-0017a4403562 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:43.534 15:09:22 json_config -- scripts/common.sh@15 -- # shopt -s extglob 00:07:43.534 15:09:22 json_config -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:43.534 15:09:22 json_config -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:43.534 15:09:22 json_config -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:43.534 15:09:22 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:43.534 15:09:22 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:43.534 15:09:22 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:43.534 15:09:22 json_config -- paths/export.sh@5 -- # export PATH 00:07:43.534 15:09:22 json_config -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@51 -- # : 0 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:07:43.534 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:07:43.534 15:09:22 json_config -- nvmf/common.sh@55 -- # have_pci_nics=0 00:07:43.534 15:09:22 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:07:43.534 15:09:22 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:07:43.534 15:09:22 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:07:43.534 15:09:22 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:07:43.534 15:09:22 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:07:43.534 15:09:22 json_config -- json_config/json_config.sh@27 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:07:43.534 WARNING: No tests are enabled so not running JSON configuration tests 00:07:43.534 15:09:22 json_config -- json_config/json_config.sh@28 -- # exit 0 00:07:43.534 00:07:43.534 real 0m0.202s 00:07:43.534 user 0m0.115s 00:07:43.534 sys 0m0.092s 00:07:43.534 15:09:22 json_config -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:43.534 15:09:22 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:43.534 ************************************ 00:07:43.534 END TEST json_config 00:07:43.534 ************************************ 00:07:43.534 15:09:22 -- spdk/autotest.sh@160 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:43.534 15:09:22 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:43.534 15:09:22 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:43.534 15:09:22 -- common/autotest_common.sh@10 -- # set +x 00:07:43.793 ************************************ 00:07:43.793 START TEST json_config_extra_key 00:07:43.793 ************************************ 00:07:43.793 15:09:22 json_config_extra_key -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:43.793 15:09:22 json_config_extra_key -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:43.793 15:09:22 json_config_extra_key -- common/autotest_common.sh@1693 -- # lcov 
--version 00:07:43.793 15:09:22 json_config_extra_key -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:43.793 15:09:22 json_config_extra_key -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@336 -- # IFS=.-: 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@336 -- # read -ra ver1 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@337 -- # IFS=.-: 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@337 -- # read -ra ver2 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@338 -- # local 'op=<' 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@340 -- # ver1_l=2 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@341 -- # ver2_l=1 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@344 -- # case "$op" in 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@345 -- # : 1 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@365 -- # decimal 1 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@353 -- # local d=1 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@355 -- # echo 1 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@365 -- # ver1[v]=1 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@366 -- # decimal 2 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@353 -- # local d=2 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@355 -- # echo 2 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@366 -- # ver2[v]=2 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:43.793 15:09:22 json_config_extra_key -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:43.794 15:09:22 json_config_extra_key -- scripts/common.sh@368 -- # return 0 00:07:43.794 15:09:22 json_config_extra_key -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:43.794 15:09:22 json_config_extra_key -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:43.794 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.794 --rc genhtml_branch_coverage=1 00:07:43.794 --rc genhtml_function_coverage=1 00:07:43.794 --rc genhtml_legend=1 00:07:43.794 --rc geninfo_all_blocks=1 00:07:43.794 --rc geninfo_unexecuted_blocks=1 00:07:43.794 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.794 ' 00:07:43.794 15:09:22 json_config_extra_key -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:43.794 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.794 --rc genhtml_branch_coverage=1 
00:07:43.794 --rc genhtml_function_coverage=1 00:07:43.794 --rc genhtml_legend=1 00:07:43.794 --rc geninfo_all_blocks=1 00:07:43.794 --rc geninfo_unexecuted_blocks=1 00:07:43.794 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.794 ' 00:07:43.794 15:09:22 json_config_extra_key -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:43.794 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.794 --rc genhtml_branch_coverage=1 00:07:43.794 --rc genhtml_function_coverage=1 00:07:43.794 --rc genhtml_legend=1 00:07:43.794 --rc geninfo_all_blocks=1 00:07:43.794 --rc geninfo_unexecuted_blocks=1 00:07:43.794 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.794 ' 00:07:43.794 15:09:22 json_config_extra_key -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:43.794 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.794 --rc genhtml_branch_coverage=1 00:07:43.794 --rc genhtml_function_coverage=1 00:07:43.794 --rc genhtml_legend=1 00:07:43.794 --rc geninfo_all_blocks=1 00:07:43.794 --rc geninfo_unexecuted_blocks=1 00:07:43.794 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.794 ' 00:07:43.794 15:09:22 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8023d868-666a-e711-906e-0017a4403562 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=8023d868-666a-e711-906e-0017a4403562 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:43.794 15:09:22 json_config_extra_key -- scripts/common.sh@15 -- # shopt -s extglob 00:07:43.794 15:09:22 json_config_extra_key -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:43.794 15:09:22 json_config_extra_key -- 
scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:43.794 15:09:22 json_config_extra_key -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:43.794 15:09:22 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:43.794 15:09:22 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:43.794 15:09:22 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:43.794 15:09:22 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:07:43.794 15:09:22 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@51 -- # : 0 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:07:43.794 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:07:43.794 15:09:22 json_config_extra_key -- nvmf/common.sh@55 -- # have_pci_nics=0 00:07:43.794 15:09:22 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:07:43.794 15:09:22 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:07:43.794 15:09:22 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # 
declare -A app_pid 00:07:43.794 15:09:22 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:07:43.794 15:09:22 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:07:43.794 15:09:22 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:07:43.794 15:09:22 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:07:43.794 15:09:22 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:07:43.794 15:09:22 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:07:43.794 15:09:22 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:07:43.794 15:09:22 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:07:43.794 INFO: launching applications... 00:07:43.794 15:09:22 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:07:43.794 15:09:22 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:07:43.794 15:09:22 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:07:43.794 15:09:22 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:43.794 15:09:22 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:43.794 15:09:22 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:07:43.794 15:09:22 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:43.794 15:09:22 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:43.794 15:09:22 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=1465378 00:07:43.794 15:09:22 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:43.794 Waiting for target to run... 00:07:43.794 15:09:22 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 1465378 /var/tmp/spdk_tgt.sock 00:07:43.794 15:09:22 json_config_extra_key -- common/autotest_common.sh@835 -- # '[' -z 1465378 ']' 00:07:43.794 15:09:22 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:07:43.794 15:09:22 json_config_extra_key -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:43.794 15:09:22 json_config_extra_key -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:43.794 15:09:22 json_config_extra_key -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:43.794 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 
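common.sh models each managed daemon as a key into a set of associative arrays (app_pid, app_socket, app_params, configs_path); "target" is just one slot. A simplified sketch of how json_config_test_start_app consumes them, with illustrative paths (the real helper takes extra arguments and waits on the socket afterwards):

    declare -A app_pid app_socket app_params configs_path
    app_socket[target]=/var/tmp/spdk_tgt.sock
    app_params[target]='-m 0x1 -s 1024'
    configs_path[target]=./extra_key.json

    start_app() {
        local app=$1
        # app_params intentionally unquoted so '-m 0x1 -s 1024' word-splits
        ./build/bin/spdk_tgt ${app_params[$app]} -r "${app_socket[$app]}" \
            --json "${configs_path[$app]}" &
        app_pid[$app]=$!
    }

With that bookkeeping in place, the launch traced below reduces to start_app target followed by the usual wait on /var/tmp/spdk_tgt.sock.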
00:07:43.794 15:09:22 json_config_extra_key -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:43.794 15:09:22 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:43.794 [2024-11-20 15:09:22.451477] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:07:43.794 [2024-11-20 15:09:22.451546] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1465378 ] 00:07:44.359 [2024-11-20 15:09:22.754248] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.359 [2024-11-20 15:09:22.768639] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.925 15:09:23 json_config_extra_key -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:44.925 15:09:23 json_config_extra_key -- common/autotest_common.sh@868 -- # return 0 00:07:44.925 15:09:23 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:07:44.925 00:07:44.925 15:09:23 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:07:44.925 INFO: shutting down applications... 00:07:44.925 15:09:23 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:07:44.925 15:09:23 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:07:44.925 15:09:23 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:44.925 15:09:23 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 1465378 ]] 00:07:44.925 15:09:23 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 1465378 00:07:44.925 15:09:23 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:44.925 15:09:23 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:44.925 15:09:23 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1465378 00:07:44.925 15:09:23 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:07:45.183 15:09:23 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:07:45.183 15:09:23 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:45.183 15:09:23 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1465378 00:07:45.183 15:09:23 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:45.183 15:09:23 json_config_extra_key -- json_config/common.sh@43 -- # break 00:07:45.183 15:09:23 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:45.183 15:09:23 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:45.183 SPDK target shutdown done 00:07:45.183 15:09:23 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:07:45.183 Success 00:07:45.183 00:07:45.183 real 0m1.586s 00:07:45.183 user 0m1.349s 00:07:45.183 sys 0m0.463s 00:07:45.183 15:09:23 json_config_extra_key -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:45.183 15:09:23 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:45.183 ************************************ 00:07:45.183 END TEST json_config_extra_key 00:07:45.183 ************************************ 00:07:45.442 15:09:23 -- spdk/autotest.sh@161 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 
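The teardown traced in json_config_test_shutdown_app is a bounded graceful stop: send SIGINT, then poll kill -0 up to 30 times at 0.5 s intervals. A minimal sketch of the same loop (the SIGKILL escalation is an assumption for the timeout case; the traced run exits cleanly before the loop runs out):

    shutdown_app() {
        local pid=$1 i
        kill -SIGINT "$pid"
        for ((i = 0; i < 30; i++)); do
            kill -0 "$pid" 2>/dev/null || { echo 'SPDK target shutdown done'; return 0; }
            sleep 0.5
        done
        kill -9 "$pid" 2>/dev/null    # only if SIGINT never took effect
        return 1
    }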
00:07:45.442 15:09:23 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:45.442 15:09:23 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:45.442 15:09:23 -- common/autotest_common.sh@10 -- # set +x 00:07:45.442 ************************************ 00:07:45.442 START TEST alias_rpc 00:07:45.442 ************************************ 00:07:45.442 15:09:23 alias_rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:45.442 * Looking for test storage... 00:07:45.442 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:07:45.442 15:09:24 alias_rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:45.442 15:09:24 alias_rpc -- common/autotest_common.sh@1693 -- # lcov --version 00:07:45.442 15:09:24 alias_rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:45.442 15:09:24 alias_rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@344 -- # case "$op" in 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@345 -- # : 1 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@365 -- # decimal 1 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@353 -- # local d=1 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@355 -- # echo 1 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@366 -- # decimal 2 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@353 -- # local d=2 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@355 -- # echo 2 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:45.442 15:09:24 alias_rpc -- scripts/common.sh@368 -- # return 0 00:07:45.442 15:09:24 alias_rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:45.442 15:09:24 alias_rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:45.442 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.442 --rc genhtml_branch_coverage=1 00:07:45.442 --rc genhtml_function_coverage=1 00:07:45.442 --rc genhtml_legend=1 00:07:45.442 --rc geninfo_all_blocks=1 00:07:45.442 --rc geninfo_unexecuted_blocks=1 00:07:45.442 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.442 ' 00:07:45.442 15:09:24 alias_rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:45.442 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.442 --rc genhtml_branch_coverage=1 00:07:45.442 --rc genhtml_function_coverage=1 00:07:45.442 --rc genhtml_legend=1 00:07:45.442 --rc geninfo_all_blocks=1 00:07:45.442 --rc geninfo_unexecuted_blocks=1 00:07:45.442 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.442 ' 00:07:45.442 15:09:24 alias_rpc -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:45.442 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.442 --rc genhtml_branch_coverage=1 00:07:45.442 --rc genhtml_function_coverage=1 00:07:45.442 --rc genhtml_legend=1 00:07:45.442 --rc geninfo_all_blocks=1 00:07:45.442 --rc geninfo_unexecuted_blocks=1 00:07:45.442 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.442 ' 00:07:45.442 15:09:24 alias_rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:45.442 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.442 --rc genhtml_branch_coverage=1 00:07:45.442 --rc genhtml_function_coverage=1 00:07:45.442 --rc genhtml_legend=1 00:07:45.442 --rc geninfo_all_blocks=1 00:07:45.442 --rc geninfo_unexecuted_blocks=1 00:07:45.442 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.442 ' 00:07:45.442 15:09:24 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:45.442 15:09:24 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1465753 00:07:45.442 15:09:24 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:45.442 15:09:24 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1465753 00:07:45.442 15:09:24 alias_rpc -- 
common/autotest_common.sh@835 -- # '[' -z 1465753 ']' 00:07:45.442 15:09:24 alias_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:45.442 15:09:24 alias_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:45.442 15:09:24 alias_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:45.442 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:45.442 15:09:24 alias_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:45.442 15:09:24 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:45.700 [2024-11-20 15:09:24.134913] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:07:45.700 [2024-11-20 15:09:24.134976] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1465753 ] 00:07:45.700 [2024-11-20 15:09:24.229583] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.700 [2024-11-20 15:09:24.261452] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.957 15:09:24 alias_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:45.957 15:09:24 alias_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:45.957 15:09:24 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:07:46.214 15:09:24 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1465753 00:07:46.214 15:09:24 alias_rpc -- common/autotest_common.sh@954 -- # '[' -z 1465753 ']' 00:07:46.214 15:09:24 alias_rpc -- common/autotest_common.sh@958 -- # kill -0 1465753 00:07:46.214 15:09:24 alias_rpc -- common/autotest_common.sh@959 -- # uname 00:07:46.214 15:09:24 alias_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:46.214 15:09:24 alias_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1465753 00:07:46.215 15:09:24 alias_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:46.215 15:09:24 alias_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:46.215 15:09:24 alias_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1465753' 00:07:46.215 killing process with pid 1465753 00:07:46.215 15:09:24 alias_rpc -- common/autotest_common.sh@973 -- # kill 1465753 00:07:46.215 15:09:24 alias_rpc -- common/autotest_common.sh@978 -- # wait 1465753 00:07:46.473 00:07:46.473 real 0m1.153s 00:07:46.473 user 0m1.076s 00:07:46.473 sys 0m0.508s 00:07:46.473 15:09:25 alias_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:46.473 15:09:25 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:46.473 ************************************ 00:07:46.473 END TEST alias_rpc 00:07:46.473 ************************************ 00:07:46.473 15:09:25 -- spdk/autotest.sh@163 -- # [[ 0 -eq 0 ]] 00:07:46.473 15:09:25 -- spdk/autotest.sh@164 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:46.473 15:09:25 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:46.473 15:09:25 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:46.473 15:09:25 -- common/autotest_common.sh@10 -- # set +x 00:07:46.473 ************************************ 00:07:46.473 START TEST 
spdkcli_tcp 00:07:46.473 ************************************ 00:07:46.473 15:09:25 spdkcli_tcp -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:46.731 * Looking for test storage... 00:07:46.731 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:07:46.731 15:09:25 spdkcli_tcp -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:46.731 15:09:25 spdkcli_tcp -- common/autotest_common.sh@1693 -- # lcov --version 00:07:46.731 15:09:25 spdkcli_tcp -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:46.731 15:09:25 spdkcli_tcp -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@336 -- # IFS=.-: 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@336 -- # read -ra ver1 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@337 -- # IFS=.-: 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@337 -- # read -ra ver2 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@338 -- # local 'op=<' 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@340 -- # ver1_l=2 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@341 -- # ver2_l=1 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@344 -- # case "$op" in 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@345 -- # : 1 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@365 -- # decimal 1 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@353 -- # local d=1 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@355 -- # echo 1 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@365 -- # ver1[v]=1 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@366 -- # decimal 2 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@353 -- # local d=2 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@355 -- # echo 2 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@366 -- # ver2[v]=2 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:46.731 15:09:25 spdkcli_tcp -- scripts/common.sh@368 -- # return 0 00:07:46.731 15:09:25 spdkcli_tcp -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:46.731 15:09:25 spdkcli_tcp -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:46.731 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:46.731 --rc genhtml_branch_coverage=1 00:07:46.731 --rc genhtml_function_coverage=1 00:07:46.731 --rc genhtml_legend=1 00:07:46.731 --rc geninfo_all_blocks=1 00:07:46.731 --rc geninfo_unexecuted_blocks=1 00:07:46.731 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:46.731 ' 00:07:46.731 15:09:25 spdkcli_tcp -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:46.731 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:46.731 --rc genhtml_branch_coverage=1 00:07:46.731 --rc genhtml_function_coverage=1 00:07:46.731 --rc genhtml_legend=1 00:07:46.731 --rc geninfo_all_blocks=1 00:07:46.731 --rc geninfo_unexecuted_blocks=1 00:07:46.731 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:46.731 ' 00:07:46.731 15:09:25 spdkcli_tcp -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:46.731 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:46.731 --rc genhtml_branch_coverage=1 00:07:46.731 --rc genhtml_function_coverage=1 00:07:46.731 --rc genhtml_legend=1 00:07:46.731 --rc geninfo_all_blocks=1 00:07:46.731 --rc geninfo_unexecuted_blocks=1 00:07:46.731 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:46.731 ' 00:07:46.732 15:09:25 spdkcli_tcp -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:46.732 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:46.732 --rc genhtml_branch_coverage=1 00:07:46.732 --rc genhtml_function_coverage=1 00:07:46.732 --rc genhtml_legend=1 00:07:46.732 --rc geninfo_all_blocks=1 00:07:46.732 --rc geninfo_unexecuted_blocks=1 00:07:46.732 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:46.732 ' 00:07:46.732 15:09:25 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:07:46.732 15:09:25 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:07:46.732 15:09:25 spdkcli_tcp -- spdkcli/common.sh@7 -- # 
spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:07:46.732 15:09:25 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:07:46.732 15:09:25 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:07:46.732 15:09:25 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:07:46.732 15:09:25 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:07:46.732 15:09:25 spdkcli_tcp -- common/autotest_common.sh@726 -- # xtrace_disable 00:07:46.732 15:09:25 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:46.732 15:09:25 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1465934 00:07:46.732 15:09:25 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 1465934 00:07:46.732 15:09:25 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:07:46.732 15:09:25 spdkcli_tcp -- common/autotest_common.sh@835 -- # '[' -z 1465934 ']' 00:07:46.732 15:09:25 spdkcli_tcp -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:46.732 15:09:25 spdkcli_tcp -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:46.732 15:09:25 spdkcli_tcp -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:46.732 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:46.732 15:09:25 spdkcli_tcp -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:46.732 15:09:25 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:46.732 [2024-11-20 15:09:25.373153] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:07:46.732 [2024-11-20 15:09:25.373243] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1465934 ] 00:07:46.989 [2024-11-20 15:09:25.461903] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:46.989 [2024-11-20 15:09:25.490812] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:46.989 [2024-11-20 15:09:25.490813] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.248 15:09:25 spdkcli_tcp -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:47.248 15:09:25 spdkcli_tcp -- common/autotest_common.sh@868 -- # return 0 00:07:47.248 15:09:25 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=1466020 00:07:47.248 15:09:25 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:07:47.248 15:09:25 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:07:47.248 [ 00:07:47.248 "spdk_get_version", 00:07:47.248 "rpc_get_methods", 00:07:47.248 "notify_get_notifications", 00:07:47.248 "notify_get_types", 00:07:47.248 "trace_get_info", 00:07:47.248 "trace_get_tpoint_group_mask", 00:07:47.248 "trace_disable_tpoint_group", 00:07:47.248 "trace_enable_tpoint_group", 00:07:47.248 "trace_clear_tpoint_mask", 00:07:47.248 "trace_set_tpoint_mask", 00:07:47.249 "fsdev_set_opts", 00:07:47.249 "fsdev_get_opts", 00:07:47.249 "framework_get_pci_devices", 00:07:47.249 "framework_get_config", 00:07:47.249 "framework_get_subsystems", 00:07:47.249 "vfu_tgt_set_base_path", 00:07:47.249 
"keyring_get_keys", 00:07:47.249 "iobuf_get_stats", 00:07:47.249 "iobuf_set_options", 00:07:47.249 "sock_get_default_impl", 00:07:47.249 "sock_set_default_impl", 00:07:47.249 "sock_impl_set_options", 00:07:47.249 "sock_impl_get_options", 00:07:47.249 "vmd_rescan", 00:07:47.249 "vmd_remove_device", 00:07:47.249 "vmd_enable", 00:07:47.249 "accel_get_stats", 00:07:47.249 "accel_set_options", 00:07:47.249 "accel_set_driver", 00:07:47.249 "accel_crypto_key_destroy", 00:07:47.249 "accel_crypto_keys_get", 00:07:47.249 "accel_crypto_key_create", 00:07:47.249 "accel_assign_opc", 00:07:47.249 "accel_get_module_info", 00:07:47.249 "accel_get_opc_assignments", 00:07:47.249 "bdev_get_histogram", 00:07:47.249 "bdev_enable_histogram", 00:07:47.249 "bdev_set_qos_limit", 00:07:47.249 "bdev_set_qd_sampling_period", 00:07:47.249 "bdev_get_bdevs", 00:07:47.249 "bdev_reset_iostat", 00:07:47.249 "bdev_get_iostat", 00:07:47.249 "bdev_examine", 00:07:47.249 "bdev_wait_for_examine", 00:07:47.249 "bdev_set_options", 00:07:47.249 "scsi_get_devices", 00:07:47.249 "thread_set_cpumask", 00:07:47.249 "scheduler_set_options", 00:07:47.249 "framework_get_governor", 00:07:47.249 "framework_get_scheduler", 00:07:47.249 "framework_set_scheduler", 00:07:47.249 "framework_get_reactors", 00:07:47.249 "thread_get_io_channels", 00:07:47.249 "thread_get_pollers", 00:07:47.249 "thread_get_stats", 00:07:47.249 "framework_monitor_context_switch", 00:07:47.249 "spdk_kill_instance", 00:07:47.249 "log_enable_timestamps", 00:07:47.249 "log_get_flags", 00:07:47.249 "log_clear_flag", 00:07:47.249 "log_set_flag", 00:07:47.249 "log_get_level", 00:07:47.249 "log_set_level", 00:07:47.249 "log_get_print_level", 00:07:47.249 "log_set_print_level", 00:07:47.249 "framework_enable_cpumask_locks", 00:07:47.249 "framework_disable_cpumask_locks", 00:07:47.249 "framework_wait_init", 00:07:47.249 "framework_start_init", 00:07:47.249 "virtio_blk_create_transport", 00:07:47.249 "virtio_blk_get_transports", 00:07:47.249 "vhost_controller_set_coalescing", 00:07:47.249 "vhost_get_controllers", 00:07:47.249 "vhost_delete_controller", 00:07:47.249 "vhost_create_blk_controller", 00:07:47.249 "vhost_scsi_controller_remove_target", 00:07:47.249 "vhost_scsi_controller_add_target", 00:07:47.249 "vhost_start_scsi_controller", 00:07:47.249 "vhost_create_scsi_controller", 00:07:47.249 "ublk_recover_disk", 00:07:47.249 "ublk_get_disks", 00:07:47.249 "ublk_stop_disk", 00:07:47.249 "ublk_start_disk", 00:07:47.249 "ublk_destroy_target", 00:07:47.249 "ublk_create_target", 00:07:47.249 "nbd_get_disks", 00:07:47.249 "nbd_stop_disk", 00:07:47.249 "nbd_start_disk", 00:07:47.249 "env_dpdk_get_mem_stats", 00:07:47.249 "nvmf_stop_mdns_prr", 00:07:47.249 "nvmf_publish_mdns_prr", 00:07:47.249 "nvmf_subsystem_get_listeners", 00:07:47.249 "nvmf_subsystem_get_qpairs", 00:07:47.249 "nvmf_subsystem_get_controllers", 00:07:47.249 "nvmf_get_stats", 00:07:47.249 "nvmf_get_transports", 00:07:47.249 "nvmf_create_transport", 00:07:47.249 "nvmf_get_targets", 00:07:47.249 "nvmf_delete_target", 00:07:47.249 "nvmf_create_target", 00:07:47.249 "nvmf_subsystem_allow_any_host", 00:07:47.249 "nvmf_subsystem_set_keys", 00:07:47.249 "nvmf_subsystem_remove_host", 00:07:47.249 "nvmf_subsystem_add_host", 00:07:47.249 "nvmf_ns_remove_host", 00:07:47.249 "nvmf_ns_add_host", 00:07:47.249 "nvmf_subsystem_remove_ns", 00:07:47.249 "nvmf_subsystem_set_ns_ana_group", 00:07:47.249 "nvmf_subsystem_add_ns", 00:07:47.249 "nvmf_subsystem_listener_set_ana_state", 00:07:47.249 "nvmf_discovery_get_referrals", 
00:07:47.249 "nvmf_discovery_remove_referral", 00:07:47.249 "nvmf_discovery_add_referral", 00:07:47.249 "nvmf_subsystem_remove_listener", 00:07:47.249 "nvmf_subsystem_add_listener", 00:07:47.249 "nvmf_delete_subsystem", 00:07:47.249 "nvmf_create_subsystem", 00:07:47.249 "nvmf_get_subsystems", 00:07:47.249 "nvmf_set_crdt", 00:07:47.249 "nvmf_set_config", 00:07:47.249 "nvmf_set_max_subsystems", 00:07:47.249 "iscsi_get_histogram", 00:07:47.249 "iscsi_enable_histogram", 00:07:47.249 "iscsi_set_options", 00:07:47.249 "iscsi_get_auth_groups", 00:07:47.249 "iscsi_auth_group_remove_secret", 00:07:47.249 "iscsi_auth_group_add_secret", 00:07:47.249 "iscsi_delete_auth_group", 00:07:47.249 "iscsi_create_auth_group", 00:07:47.249 "iscsi_set_discovery_auth", 00:07:47.249 "iscsi_get_options", 00:07:47.249 "iscsi_target_node_request_logout", 00:07:47.249 "iscsi_target_node_set_redirect", 00:07:47.249 "iscsi_target_node_set_auth", 00:07:47.249 "iscsi_target_node_add_lun", 00:07:47.249 "iscsi_get_stats", 00:07:47.249 "iscsi_get_connections", 00:07:47.249 "iscsi_portal_group_set_auth", 00:07:47.249 "iscsi_start_portal_group", 00:07:47.249 "iscsi_delete_portal_group", 00:07:47.249 "iscsi_create_portal_group", 00:07:47.249 "iscsi_get_portal_groups", 00:07:47.249 "iscsi_delete_target_node", 00:07:47.249 "iscsi_target_node_remove_pg_ig_maps", 00:07:47.249 "iscsi_target_node_add_pg_ig_maps", 00:07:47.249 "iscsi_create_target_node", 00:07:47.249 "iscsi_get_target_nodes", 00:07:47.249 "iscsi_delete_initiator_group", 00:07:47.249 "iscsi_initiator_group_remove_initiators", 00:07:47.249 "iscsi_initiator_group_add_initiators", 00:07:47.249 "iscsi_create_initiator_group", 00:07:47.249 "iscsi_get_initiator_groups", 00:07:47.249 "fsdev_aio_delete", 00:07:47.249 "fsdev_aio_create", 00:07:47.249 "keyring_linux_set_options", 00:07:47.249 "keyring_file_remove_key", 00:07:47.249 "keyring_file_add_key", 00:07:47.249 "vfu_virtio_create_fs_endpoint", 00:07:47.249 "vfu_virtio_create_scsi_endpoint", 00:07:47.249 "vfu_virtio_scsi_remove_target", 00:07:47.249 "vfu_virtio_scsi_add_target", 00:07:47.249 "vfu_virtio_create_blk_endpoint", 00:07:47.249 "vfu_virtio_delete_endpoint", 00:07:47.249 "iaa_scan_accel_module", 00:07:47.249 "dsa_scan_accel_module", 00:07:47.249 "ioat_scan_accel_module", 00:07:47.249 "accel_error_inject_error", 00:07:47.249 "bdev_iscsi_delete", 00:07:47.249 "bdev_iscsi_create", 00:07:47.249 "bdev_iscsi_set_options", 00:07:47.249 "bdev_virtio_attach_controller", 00:07:47.249 "bdev_virtio_scsi_get_devices", 00:07:47.249 "bdev_virtio_detach_controller", 00:07:47.249 "bdev_virtio_blk_set_hotplug", 00:07:47.249 "bdev_ftl_set_property", 00:07:47.249 "bdev_ftl_get_properties", 00:07:47.249 "bdev_ftl_get_stats", 00:07:47.249 "bdev_ftl_unmap", 00:07:47.249 "bdev_ftl_unload", 00:07:47.249 "bdev_ftl_delete", 00:07:47.249 "bdev_ftl_load", 00:07:47.249 "bdev_ftl_create", 00:07:47.249 "bdev_aio_delete", 00:07:47.249 "bdev_aio_rescan", 00:07:47.249 "bdev_aio_create", 00:07:47.249 "blobfs_create", 00:07:47.249 "blobfs_detect", 00:07:47.249 "blobfs_set_cache_size", 00:07:47.249 "bdev_zone_block_delete", 00:07:47.249 "bdev_zone_block_create", 00:07:47.249 "bdev_delay_delete", 00:07:47.249 "bdev_delay_create", 00:07:47.249 "bdev_delay_update_latency", 00:07:47.249 "bdev_split_delete", 00:07:47.249 "bdev_split_create", 00:07:47.249 "bdev_error_inject_error", 00:07:47.249 "bdev_error_delete", 00:07:47.249 "bdev_error_create", 00:07:47.249 "bdev_raid_set_options", 00:07:47.249 "bdev_raid_remove_base_bdev", 00:07:47.249 
"bdev_raid_add_base_bdev", 00:07:47.249 "bdev_raid_delete", 00:07:47.249 "bdev_raid_create", 00:07:47.249 "bdev_raid_get_bdevs", 00:07:47.249 "bdev_lvol_set_parent_bdev", 00:07:47.249 "bdev_lvol_set_parent", 00:07:47.249 "bdev_lvol_check_shallow_copy", 00:07:47.249 "bdev_lvol_start_shallow_copy", 00:07:47.249 "bdev_lvol_grow_lvstore", 00:07:47.249 "bdev_lvol_get_lvols", 00:07:47.249 "bdev_lvol_get_lvstores", 00:07:47.249 "bdev_lvol_delete", 00:07:47.249 "bdev_lvol_set_read_only", 00:07:47.249 "bdev_lvol_resize", 00:07:47.249 "bdev_lvol_decouple_parent", 00:07:47.249 "bdev_lvol_inflate", 00:07:47.249 "bdev_lvol_rename", 00:07:47.249 "bdev_lvol_clone_bdev", 00:07:47.249 "bdev_lvol_clone", 00:07:47.249 "bdev_lvol_snapshot", 00:07:47.249 "bdev_lvol_create", 00:07:47.249 "bdev_lvol_delete_lvstore", 00:07:47.249 "bdev_lvol_rename_lvstore", 00:07:47.249 "bdev_lvol_create_lvstore", 00:07:47.249 "bdev_passthru_delete", 00:07:47.249 "bdev_passthru_create", 00:07:47.249 "bdev_nvme_cuse_unregister", 00:07:47.249 "bdev_nvme_cuse_register", 00:07:47.249 "bdev_opal_new_user", 00:07:47.249 "bdev_opal_set_lock_state", 00:07:47.249 "bdev_opal_delete", 00:07:47.249 "bdev_opal_get_info", 00:07:47.249 "bdev_opal_create", 00:07:47.249 "bdev_nvme_opal_revert", 00:07:47.249 "bdev_nvme_opal_init", 00:07:47.249 "bdev_nvme_send_cmd", 00:07:47.249 "bdev_nvme_set_keys", 00:07:47.249 "bdev_nvme_get_path_iostat", 00:07:47.249 "bdev_nvme_get_mdns_discovery_info", 00:07:47.249 "bdev_nvme_stop_mdns_discovery", 00:07:47.249 "bdev_nvme_start_mdns_discovery", 00:07:47.249 "bdev_nvme_set_multipath_policy", 00:07:47.249 "bdev_nvme_set_preferred_path", 00:07:47.249 "bdev_nvme_get_io_paths", 00:07:47.249 "bdev_nvme_remove_error_injection", 00:07:47.249 "bdev_nvme_add_error_injection", 00:07:47.250 "bdev_nvme_get_discovery_info", 00:07:47.250 "bdev_nvme_stop_discovery", 00:07:47.250 "bdev_nvme_start_discovery", 00:07:47.250 "bdev_nvme_get_controller_health_info", 00:07:47.250 "bdev_nvme_disable_controller", 00:07:47.250 "bdev_nvme_enable_controller", 00:07:47.250 "bdev_nvme_reset_controller", 00:07:47.250 "bdev_nvme_get_transport_statistics", 00:07:47.250 "bdev_nvme_apply_firmware", 00:07:47.250 "bdev_nvme_detach_controller", 00:07:47.250 "bdev_nvme_get_controllers", 00:07:47.250 "bdev_nvme_attach_controller", 00:07:47.250 "bdev_nvme_set_hotplug", 00:07:47.250 "bdev_nvme_set_options", 00:07:47.250 "bdev_null_resize", 00:07:47.250 "bdev_null_delete", 00:07:47.250 "bdev_null_create", 00:07:47.250 "bdev_malloc_delete", 00:07:47.250 "bdev_malloc_create" 00:07:47.250 ] 00:07:47.250 15:09:25 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:07:47.250 15:09:25 spdkcli_tcp -- common/autotest_common.sh@732 -- # xtrace_disable 00:07:47.250 15:09:25 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:47.250 15:09:25 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:07:47.250 15:09:25 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 1465934 00:07:47.250 15:09:25 spdkcli_tcp -- common/autotest_common.sh@954 -- # '[' -z 1465934 ']' 00:07:47.250 15:09:25 spdkcli_tcp -- common/autotest_common.sh@958 -- # kill -0 1465934 00:07:47.250 15:09:25 spdkcli_tcp -- common/autotest_common.sh@959 -- # uname 00:07:47.250 15:09:25 spdkcli_tcp -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:47.250 15:09:25 spdkcli_tcp -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1465934 00:07:47.508 15:09:25 spdkcli_tcp -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:47.508 
15:09:25 spdkcli_tcp -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:47.508 15:09:25 spdkcli_tcp -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1465934' 00:07:47.509 killing process with pid 1465934 00:07:47.509 15:09:25 spdkcli_tcp -- common/autotest_common.sh@973 -- # kill 1465934 00:07:47.509 15:09:25 spdkcli_tcp -- common/autotest_common.sh@978 -- # wait 1465934 00:07:47.768 00:07:47.768 real 0m1.165s 00:07:47.768 user 0m1.924s 00:07:47.768 sys 0m0.511s 00:07:47.768 15:09:26 spdkcli_tcp -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:47.768 15:09:26 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:47.768 ************************************ 00:07:47.768 END TEST spdkcli_tcp 00:07:47.768 ************************************ 00:07:47.768 15:09:26 -- spdk/autotest.sh@167 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:47.768 15:09:26 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:47.768 15:09:26 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:47.768 15:09:26 -- common/autotest_common.sh@10 -- # set +x 00:07:47.768 ************************************ 00:07:47.768 START TEST dpdk_mem_utility 00:07:47.768 ************************************ 00:07:47.768 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:48.027 * Looking for test storage... 00:07:48.027 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:07:48.027 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:48.027 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # lcov --version 00:07:48.027 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:48.027 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@336 -- # IFS=.-: 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@336 -- # read -ra ver1 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@337 -- # IFS=.-: 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@337 -- # read -ra ver2 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@338 -- # local 'op=<' 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@340 -- # ver1_l=2 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@341 -- # ver2_l=1 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@344 -- # case "$op" in 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@345 -- # : 1 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@365 -- # decimal 1 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=1 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 1 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@365 -- # ver1[v]=1 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@366 -- # decimal 2 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=2 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 2 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@366 -- # ver2[v]=2 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:48.027 15:09:26 dpdk_mem_utility -- scripts/common.sh@368 -- # return 0 00:07:48.027 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:48.027 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:48.027 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:48.027 --rc genhtml_branch_coverage=1 00:07:48.027 --rc genhtml_function_coverage=1 00:07:48.027 --rc genhtml_legend=1 00:07:48.027 --rc geninfo_all_blocks=1 00:07:48.027 --rc geninfo_unexecuted_blocks=1 00:07:48.027 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:48.027 ' 00:07:48.027 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:48.027 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:48.027 --rc genhtml_branch_coverage=1 00:07:48.027 --rc genhtml_function_coverage=1 00:07:48.027 --rc genhtml_legend=1 00:07:48.027 --rc geninfo_all_blocks=1 00:07:48.027 --rc geninfo_unexecuted_blocks=1 00:07:48.027 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:48.027 ' 00:07:48.027 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:48.027 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:48.027 --rc genhtml_branch_coverage=1 00:07:48.027 --rc genhtml_function_coverage=1 00:07:48.027 --rc genhtml_legend=1 00:07:48.027 --rc geninfo_all_blocks=1 00:07:48.027 --rc geninfo_unexecuted_blocks=1 00:07:48.027 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:48.027 ' 00:07:48.027 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:48.027 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:48.027 --rc genhtml_branch_coverage=1 00:07:48.027 --rc genhtml_function_coverage=1 00:07:48.027 --rc genhtml_legend=1 00:07:48.027 --rc geninfo_all_blocks=1 00:07:48.027 --rc geninfo_unexecuted_blocks=1 00:07:48.027 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:48.027 ' 00:07:48.027 15:09:26 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:48.027 15:09:26 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1466142 00:07:48.027 15:09:26 dpdk_mem_utility -- 
dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:48.027 15:09:26 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1466142 00:07:48.027 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@835 -- # '[' -z 1466142 ']' 00:07:48.027 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:48.027 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:48.027 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:48.027 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:48.027 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:48.027 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:48.027 [2024-11-20 15:09:26.620192] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:07:48.027 [2024-11-20 15:09:26.620272] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1466142 ] 00:07:48.285 [2024-11-20 15:09:26.720838] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.285 [2024-11-20 15:09:26.755048] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.544 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:48.544 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@868 -- # return 0 00:07:48.544 15:09:26 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:07:48.544 15:09:26 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:07:48.544 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:48.544 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:48.544 { 00:07:48.544 "filename": "/tmp/spdk_mem_dump.txt" 00:07:48.544 } 00:07:48.544 15:09:26 dpdk_mem_utility -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:48.544 15:09:26 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:48.544 DPDK memory size 810.000000 MiB in 1 heap(s) 00:07:48.544 1 heaps totaling size 810.000000 MiB 00:07:48.544 size: 810.000000 MiB heap id: 0 00:07:48.544 end heaps---------- 00:07:48.544 9 mempools totaling size 595.772034 MiB 00:07:48.545 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:07:48.545 size: 158.602051 MiB name: PDU_data_out_Pool 00:07:48.545 size: 92.545471 MiB name: bdev_io_1466142 00:07:48.545 size: 50.003479 MiB name: msgpool_1466142 00:07:48.545 size: 36.509338 MiB name: fsdev_io_1466142 00:07:48.545 size: 21.763794 MiB name: PDU_Pool 00:07:48.545 size: 19.513306 MiB name: SCSI_TASK_Pool 00:07:48.545 size: 4.133484 MiB name: evtpool_1466142 00:07:48.545 size: 0.026123 MiB name: Session_Pool 00:07:48.545 end mempools------- 00:07:48.545 6 memzones totaling size 4.142822 MiB 00:07:48.545 size: 1.000366 MiB name: RG_ring_0_1466142 00:07:48.545 size: 1.000366 MiB name: RG_ring_1_1466142 00:07:48.545 size: 1.000366 MiB name: RG_ring_4_1466142 
00:07:48.545 size: 1.000366 MiB name: RG_ring_5_1466142 00:07:48.545 size: 0.125366 MiB name: RG_ring_2_1466142 00:07:48.545 size: 0.015991 MiB name: RG_ring_3_1466142 00:07:48.545 end memzones------- 00:07:48.545 15:09:27 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:07:48.545 heap id: 0 total size: 810.000000 MiB number of busy elements: 44 number of free elements: 15 00:07:48.545 list of free elements. size: 10.862488 MiB 00:07:48.545 element at address: 0x200018a00000 with size: 0.999878 MiB 00:07:48.545 element at address: 0x200018c00000 with size: 0.999878 MiB 00:07:48.545 element at address: 0x200000400000 with size: 0.998535 MiB 00:07:48.545 element at address: 0x200031800000 with size: 0.994446 MiB 00:07:48.545 element at address: 0x200008000000 with size: 0.959839 MiB 00:07:48.545 element at address: 0x200012c00000 with size: 0.954285 MiB 00:07:48.545 element at address: 0x200018e00000 with size: 0.936584 MiB 00:07:48.545 element at address: 0x200000200000 with size: 0.717346 MiB 00:07:48.545 element at address: 0x20001a600000 with size: 0.582886 MiB 00:07:48.545 element at address: 0x200000c00000 with size: 0.495422 MiB 00:07:48.545 element at address: 0x200003e00000 with size: 0.490723 MiB 00:07:48.545 element at address: 0x200019000000 with size: 0.485657 MiB 00:07:48.545 element at address: 0x200010600000 with size: 0.481934 MiB 00:07:48.545 element at address: 0x200027a00000 with size: 0.410034 MiB 00:07:48.545 element at address: 0x200000800000 with size: 0.355042 MiB 00:07:48.545 list of standard malloc elements. size: 199.218628 MiB 00:07:48.545 element at address: 0x2000081fff80 with size: 132.000122 MiB 00:07:48.545 element at address: 0x200003ffff80 with size: 64.000122 MiB 00:07:48.545 element at address: 0x200018afff80 with size: 1.000122 MiB 00:07:48.545 element at address: 0x200018cfff80 with size: 1.000122 MiB 00:07:48.545 element at address: 0x200018efff80 with size: 1.000122 MiB 00:07:48.545 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:07:48.545 element at address: 0x200018eeff00 with size: 0.062622 MiB 00:07:48.545 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:07:48.545 element at address: 0x200018eefdc0 with size: 0.000305 MiB 00:07:48.545 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:07:48.545 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:07:48.545 element at address: 0x2000004ffa00 with size: 0.000183 MiB 00:07:48.545 element at address: 0x2000004ffac0 with size: 0.000183 MiB 00:07:48.545 element at address: 0x2000004ffb80 with size: 0.000183 MiB 00:07:48.545 element at address: 0x2000004ffd80 with size: 0.000183 MiB 00:07:48.545 element at address: 0x2000004ffe40 with size: 0.000183 MiB 00:07:48.545 element at address: 0x20000085ae40 with size: 0.000183 MiB 00:07:48.545 element at address: 0x20000085b040 with size: 0.000183 MiB 00:07:48.545 element at address: 0x20000085b100 with size: 0.000183 MiB 00:07:48.545 element at address: 0x2000008db3c0 with size: 0.000183 MiB 00:07:48.545 element at address: 0x2000008db5c0 with size: 0.000183 MiB 00:07:48.545 element at address: 0x2000008df880 with size: 0.000183 MiB 00:07:48.545 element at address: 0x2000008ffb40 with size: 0.000183 MiB 00:07:48.545 element at address: 0x200000c7ed40 with size: 0.000183 MiB 00:07:48.545 element at address: 0x200000cff000 with size: 0.000183 MiB 00:07:48.545 element at address: 0x200000cff0c0 with size: 
0.000183 MiB 00:07:48.545 element at address: 0x200003e7da00 with size: 0.000183 MiB 00:07:48.545 element at address: 0x200003e7dac0 with size: 0.000183 MiB 00:07:48.545 element at address: 0x200003efdd80 with size: 0.000183 MiB 00:07:48.545 element at address: 0x2000080fdd80 with size: 0.000183 MiB 00:07:48.545 element at address: 0x20001067b600 with size: 0.000183 MiB 00:07:48.545 element at address: 0x20001067b6c0 with size: 0.000183 MiB 00:07:48.545 element at address: 0x2000106fb980 with size: 0.000183 MiB 00:07:48.545 element at address: 0x200012cf44c0 with size: 0.000183 MiB 00:07:48.545 element at address: 0x200018eefc40 with size: 0.000183 MiB 00:07:48.545 element at address: 0x200018eefd00 with size: 0.000183 MiB 00:07:48.545 element at address: 0x2000190bc740 with size: 0.000183 MiB 00:07:48.545 element at address: 0x20001a695380 with size: 0.000183 MiB 00:07:48.545 element at address: 0x20001a695440 with size: 0.000183 MiB 00:07:48.545 element at address: 0x200027a68f80 with size: 0.000183 MiB 00:07:48.545 element at address: 0x200027a69040 with size: 0.000183 MiB 00:07:48.545 element at address: 0x200027a6fc40 with size: 0.000183 MiB 00:07:48.545 element at address: 0x200027a6fe40 with size: 0.000183 MiB 00:07:48.545 element at address: 0x200027a6ff00 with size: 0.000183 MiB 00:07:48.545 list of memzone associated elements. size: 599.918884 MiB 00:07:48.545 element at address: 0x20001a695500 with size: 211.416748 MiB 00:07:48.545 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:07:48.545 element at address: 0x200027a6ffc0 with size: 157.562561 MiB 00:07:48.545 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:07:48.545 element at address: 0x200012df4780 with size: 92.045044 MiB 00:07:48.545 associated memzone info: size: 92.044922 MiB name: MP_bdev_io_1466142_0 00:07:48.545 element at address: 0x200000dff380 with size: 48.003052 MiB 00:07:48.545 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1466142_0 00:07:48.545 element at address: 0x2000107fdb80 with size: 36.008911 MiB 00:07:48.545 associated memzone info: size: 36.008789 MiB name: MP_fsdev_io_1466142_0 00:07:48.545 element at address: 0x2000191be940 with size: 20.255554 MiB 00:07:48.545 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:07:48.545 element at address: 0x2000319feb40 with size: 18.005066 MiB 00:07:48.545 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:07:48.545 element at address: 0x2000004fff00 with size: 3.000244 MiB 00:07:48.545 associated memzone info: size: 3.000122 MiB name: MP_evtpool_1466142_0 00:07:48.545 element at address: 0x2000009ffe00 with size: 2.000488 MiB 00:07:48.545 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1466142 00:07:48.545 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:07:48.545 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1466142 00:07:48.545 element at address: 0x2000106fba40 with size: 1.008118 MiB 00:07:48.545 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:07:48.545 element at address: 0x2000190bc800 with size: 1.008118 MiB 00:07:48.545 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:07:48.545 element at address: 0x2000080fde40 with size: 1.008118 MiB 00:07:48.545 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:07:48.545 element at address: 0x200003efde40 with size: 1.008118 MiB 00:07:48.545 associated memzone info: size: 1.007996 MiB 
name: MP_SCSI_TASK_Pool 00:07:48.545 element at address: 0x200000cff180 with size: 1.000488 MiB 00:07:48.545 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1466142 00:07:48.545 element at address: 0x2000008ffc00 with size: 1.000488 MiB 00:07:48.545 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1466142 00:07:48.545 element at address: 0x200012cf4580 with size: 1.000488 MiB 00:07:48.545 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1466142 00:07:48.545 element at address: 0x2000318fe940 with size: 1.000488 MiB 00:07:48.545 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1466142 00:07:48.545 element at address: 0x20000085b1c0 with size: 0.500488 MiB 00:07:48.545 associated memzone info: size: 0.500366 MiB name: RG_MP_fsdev_io_1466142 00:07:48.545 element at address: 0x200000c7ee00 with size: 0.500488 MiB 00:07:48.545 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1466142 00:07:48.545 element at address: 0x20001067b780 with size: 0.500488 MiB 00:07:48.545 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:07:48.545 element at address: 0x200003e7db80 with size: 0.500488 MiB 00:07:48.545 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:07:48.545 element at address: 0x20001907c540 with size: 0.250488 MiB 00:07:48.545 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:07:48.545 element at address: 0x2000002b7a40 with size: 0.125488 MiB 00:07:48.545 associated memzone info: size: 0.125366 MiB name: RG_MP_evtpool_1466142 00:07:48.545 element at address: 0x2000008df940 with size: 0.125488 MiB 00:07:48.545 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1466142 00:07:48.545 element at address: 0x2000080f5b80 with size: 0.031738 MiB 00:07:48.545 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:07:48.545 element at address: 0x200027a69100 with size: 0.023743 MiB 00:07:48.545 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:07:48.545 element at address: 0x2000008db680 with size: 0.016113 MiB 00:07:48.545 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1466142 00:07:48.545 element at address: 0x200027a6f240 with size: 0.002441 MiB 00:07:48.545 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:07:48.545 element at address: 0x2000004ffc40 with size: 0.000305 MiB 00:07:48.545 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1466142 00:07:48.545 element at address: 0x2000008db480 with size: 0.000305 MiB 00:07:48.545 associated memzone info: size: 0.000183 MiB name: MP_fsdev_io_1466142 00:07:48.545 element at address: 0x20000085af00 with size: 0.000305 MiB 00:07:48.545 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1466142 00:07:48.545 element at address: 0x200027a6fd00 with size: 0.000305 MiB 00:07:48.545 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:07:48.546 15:09:27 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:07:48.546 15:09:27 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1466142 00:07:48.546 15:09:27 dpdk_mem_utility -- common/autotest_common.sh@954 -- # '[' -z 1466142 ']' 00:07:48.546 15:09:27 dpdk_mem_utility -- common/autotest_common.sh@958 -- # kill -0 1466142 00:07:48.546 15:09:27 dpdk_mem_utility -- common/autotest_common.sh@959 -- # uname 00:07:48.546 15:09:27 dpdk_mem_utility -- common/autotest_common.sh@959 -- # '[' Linux 
= Linux ']' 00:07:48.546 15:09:27 dpdk_mem_utility -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1466142 00:07:48.546 15:09:27 dpdk_mem_utility -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:48.546 15:09:27 dpdk_mem_utility -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:48.546 15:09:27 dpdk_mem_utility -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1466142' 00:07:48.546 killing process with pid 1466142 00:07:48.546 15:09:27 dpdk_mem_utility -- common/autotest_common.sh@973 -- # kill 1466142 00:07:48.546 15:09:27 dpdk_mem_utility -- common/autotest_common.sh@978 -- # wait 1466142 00:07:48.803 00:07:48.803 real 0m1.088s 00:07:48.803 user 0m0.946s 00:07:48.803 sys 0m0.510s 00:07:48.803 15:09:27 dpdk_mem_utility -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:48.803 15:09:27 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:48.803 ************************************ 00:07:48.803 END TEST dpdk_mem_utility 00:07:48.803 ************************************ 00:07:49.061 15:09:27 -- spdk/autotest.sh@168 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:07:49.061 15:09:27 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:49.061 15:09:27 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:49.061 15:09:27 -- common/autotest_common.sh@10 -- # set +x 00:07:49.061 ************************************ 00:07:49.061 START TEST event 00:07:49.061 ************************************ 00:07:49.061 15:09:27 event -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:07:49.061 * Looking for test storage... 00:07:49.061 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:49.061 15:09:27 event -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:49.061 15:09:27 event -- common/autotest_common.sh@1693 -- # lcov --version 00:07:49.061 15:09:27 event -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:49.061 15:09:27 event -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:49.061 15:09:27 event -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:49.061 15:09:27 event -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:49.061 15:09:27 event -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:49.061 15:09:27 event -- scripts/common.sh@336 -- # IFS=.-: 00:07:49.061 15:09:27 event -- scripts/common.sh@336 -- # read -ra ver1 00:07:49.061 15:09:27 event -- scripts/common.sh@337 -- # IFS=.-: 00:07:49.061 15:09:27 event -- scripts/common.sh@337 -- # read -ra ver2 00:07:49.061 15:09:27 event -- scripts/common.sh@338 -- # local 'op=<' 00:07:49.061 15:09:27 event -- scripts/common.sh@340 -- # ver1_l=2 00:07:49.061 15:09:27 event -- scripts/common.sh@341 -- # ver2_l=1 00:07:49.061 15:09:27 event -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:49.061 15:09:27 event -- scripts/common.sh@344 -- # case "$op" in 00:07:49.061 15:09:27 event -- scripts/common.sh@345 -- # : 1 00:07:49.061 15:09:27 event -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:49.061 15:09:27 event -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:49.061 15:09:27 event -- scripts/common.sh@365 -- # decimal 1 00:07:49.320 15:09:27 event -- scripts/common.sh@353 -- # local d=1 00:07:49.320 15:09:27 event -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:49.320 15:09:27 event -- scripts/common.sh@355 -- # echo 1 00:07:49.320 15:09:27 event -- scripts/common.sh@365 -- # ver1[v]=1 00:07:49.320 15:09:27 event -- scripts/common.sh@366 -- # decimal 2 00:07:49.320 15:09:27 event -- scripts/common.sh@353 -- # local d=2 00:07:49.320 15:09:27 event -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:49.320 15:09:27 event -- scripts/common.sh@355 -- # echo 2 00:07:49.320 15:09:27 event -- scripts/common.sh@366 -- # ver2[v]=2 00:07:49.320 15:09:27 event -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:49.320 15:09:27 event -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:49.320 15:09:27 event -- scripts/common.sh@368 -- # return 0 00:07:49.320 15:09:27 event -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:49.320 15:09:27 event -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:49.320 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.320 --rc genhtml_branch_coverage=1 00:07:49.320 --rc genhtml_function_coverage=1 00:07:49.320 --rc genhtml_legend=1 00:07:49.320 --rc geninfo_all_blocks=1 00:07:49.320 --rc geninfo_unexecuted_blocks=1 00:07:49.320 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.320 ' 00:07:49.320 15:09:27 event -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:49.320 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.320 --rc genhtml_branch_coverage=1 00:07:49.320 --rc genhtml_function_coverage=1 00:07:49.320 --rc genhtml_legend=1 00:07:49.320 --rc geninfo_all_blocks=1 00:07:49.320 --rc geninfo_unexecuted_blocks=1 00:07:49.320 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.320 ' 00:07:49.320 15:09:27 event -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:49.320 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.320 --rc genhtml_branch_coverage=1 00:07:49.320 --rc genhtml_function_coverage=1 00:07:49.320 --rc genhtml_legend=1 00:07:49.320 --rc geninfo_all_blocks=1 00:07:49.320 --rc geninfo_unexecuted_blocks=1 00:07:49.320 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.320 ' 00:07:49.320 15:09:27 event -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:49.320 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.320 --rc genhtml_branch_coverage=1 00:07:49.320 --rc genhtml_function_coverage=1 00:07:49.320 --rc genhtml_legend=1 00:07:49.320 --rc geninfo_all_blocks=1 00:07:49.320 --rc geninfo_unexecuted_blocks=1 00:07:49.320 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.320 ' 00:07:49.320 15:09:27 event -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:49.320 15:09:27 event -- bdev/nbd_common.sh@6 -- # set -e 00:07:49.320 15:09:27 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:49.320 15:09:27 event -- common/autotest_common.sh@1105 -- # '[' 6 -le 1 ']' 00:07:49.320 15:09:27 event -- common/autotest_common.sh@1111 -- # xtrace_disable 
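The scripts/common.sh preamble that opens this test (and repeats before every test in this log) is a version gate: lt 1.15 2 splits the installed lcov version and the threshold on dots and dashes and compares them field by field, and the return 0 above means 1.15 < 2, so the older --rc lcov_* option names get exported. A condensed sketch of that comparison; the function name and the final echo are illustrative, not part of common.sh:

version_lt() {                          # returns 0 (true) when $1 < $2
  local IFS=.- v
  local -a a b
  read -ra a <<< "$1"; read -ra b <<< "$2"
  local n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
  for (( v = 0; v < n; v++ )); do       # missing fields compare as 0
    (( ${a[v]:-0} > ${b[v]:-0} )) && return 1
    (( ${a[v]:-0} < ${b[v]:-0} )) && return 0
  done
  return 1                              # equal is not less-than
}
version_lt 1.15 2 && echo "pre-2.x lcov: keep the old --rc option names"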
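Backing up a step to the dpdk_mem_utility run that finished above: its whole job is the two-stage memory introspection path — an RPC makes the target dump its DPDK allocator state to /tmp/spdk_mem_dump.txt (the filename in the JSON reply), and dpdk_mem_info.py then digests the dump offline. Reconstructed from the trace, with the paths this job uses:

SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
"$SPDK"/scripts/rpc.py env_dpdk_get_mem_stats   # target writes /tmp/spdk_mem_dump.txt
"$SPDK"/scripts/dpdk_mem_info.py                # summary: heaps, mempools, memzones
"$SPDK"/scripts/dpdk_mem_info.py -m 0           # element-level detail for heap id 0

In the run above the summary accounted for one 810 MiB heap plus nine mempools totaling ~595.77 MiB, and the -m 0 view broke heap 0 down into 44 busy and 15 free elements.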
00:07:49.320 15:09:27 event -- common/autotest_common.sh@10 -- # set +x 00:07:49.320 ************************************ 00:07:49.320 START TEST event_perf 00:07:49.320 ************************************ 00:07:49.320 15:09:27 event.event_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:49.320 Running I/O for 1 seconds...[2024-11-20 15:09:27.801273] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:07:49.320 [2024-11-20 15:09:27.801363] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1466338 ] 00:07:49.320 [2024-11-20 15:09:27.888880] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:49.320 [2024-11-20 15:09:27.916462] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:49.320 [2024-11-20 15:09:27.916551] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:49.320 [2024-11-20 15:09:27.916626] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:49.320 [2024-11-20 15:09:27.916627] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.701 Running I/O for 1 seconds... 00:07:50.701 lcore 0: 190543 00:07:50.701 lcore 1: 190543 00:07:50.701 lcore 2: 190543 00:07:50.701 lcore 3: 190543 00:07:50.701 done. 00:07:50.701 00:07:50.701 real 0m1.172s 00:07:50.701 user 0m4.081s 00:07:50.701 sys 0m0.088s 00:07:50.701 15:09:28 event.event_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:50.701 15:09:28 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:07:50.701 ************************************ 00:07:50.701 END TEST event_perf 00:07:50.701 ************************************ 00:07:50.701 15:09:28 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:50.701 15:09:28 event -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:07:50.701 15:09:28 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:50.701 15:09:28 event -- common/autotest_common.sh@10 -- # set +x 00:07:50.701 ************************************ 00:07:50.701 START TEST event_reactor 00:07:50.701 ************************************ 00:07:50.701 15:09:29 event.event_reactor -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:50.701 [2024-11-20 15:09:29.047776] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
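event_perf above is the first of three one-second event microbenchmarks: -m 0xF is a coremask selecting lcores 0-3 (hence the four reactors in the trace) and -t 1 is the runtime in seconds; each "lcore N:" line is the number of events that reactor handled, and the four identical 190543 counts show dispatch staying evenly balanced. Rerun by hand, with the paths assumed from this job:

SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
"$SPDK"/test/event/event_perf/event_perf -m 0xF -t 1   # 0xF = lcores 0-3
# shrinking the mask shrinks the reactor count, e.g. -m 0x3 reports lcores 0-1 only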
00:07:50.701 [2024-11-20 15:09:29.047881] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1466537 ] 00:07:50.701 [2024-11-20 15:09:29.136785] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.701 [2024-11-20 15:09:29.161979] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.642 test_start 00:07:51.642 oneshot 00:07:51.642 tick 100 00:07:51.642 tick 100 00:07:51.642 tick 250 00:07:51.642 tick 100 00:07:51.642 tick 100 00:07:51.642 tick 100 00:07:51.642 tick 250 00:07:51.642 tick 500 00:07:51.642 tick 100 00:07:51.642 tick 100 00:07:51.642 tick 250 00:07:51.642 tick 100 00:07:51.642 tick 100 00:07:51.642 test_end 00:07:51.642 00:07:51.642 real 0m1.170s 00:07:51.642 user 0m1.073s 00:07:51.642 sys 0m0.093s 00:07:51.642 15:09:30 event.event_reactor -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:51.642 15:09:30 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:07:51.642 ************************************ 00:07:51.642 END TEST event_reactor 00:07:51.642 ************************************ 00:07:51.642 15:09:30 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:51.642 15:09:30 event -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:07:51.642 15:09:30 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:51.642 15:09:30 event -- common/autotest_common.sh@10 -- # set +x 00:07:51.642 ************************************ 00:07:51.642 START TEST event_reactor_perf 00:07:51.642 ************************************ 00:07:51.642 15:09:30 event.event_reactor_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:51.642 [2024-11-20 15:09:30.288903] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
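The event_reactor output above reads as a timer trace: after test_start it prints one oneshot and then a tick line per expiry over the one-second run — nine "tick 100", three "tick 250", one "tick 500" here, counts consistent with pollers registered at periods in the printed ratios (an inference from the counts; the harness does not say so explicitly). Tallying a fresh run the same way:

SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
"$SPDK"/test/event/reactor/reactor -t 1 | sort | uniq -c   # e.g. 9x "tick 100", 3x "tick 250", 1x "tick 500"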
00:07:51.642 [2024-11-20 15:09:30.289011] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1466737 ] 00:07:51.900 [2024-11-20 15:09:30.378498] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.900 [2024-11-20 15:09:30.404444] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.835 test_start 00:07:52.835 test_end 00:07:52.835 Performance: 926072 events per second 00:07:52.835 00:07:52.835 real 0m1.169s 00:07:52.835 user 0m1.068s 00:07:52.835 sys 0m0.097s 00:07:52.835 15:09:31 event.event_reactor_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:52.835 15:09:31 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:07:52.835 ************************************ 00:07:52.835 END TEST event_reactor_perf 00:07:52.835 ************************************ 00:07:52.835 15:09:31 event -- event/event.sh@49 -- # uname -s 00:07:52.835 15:09:31 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:07:52.835 15:09:31 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:52.835 15:09:31 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:52.835 15:09:31 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:52.835 15:09:31 event -- common/autotest_common.sh@10 -- # set +x 00:07:52.835 ************************************ 00:07:52.835 START TEST event_scheduler 00:07:52.835 ************************************ 00:07:52.835 15:09:31 event.event_scheduler -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:53.093 * Looking for test storage... 
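event_reactor_perf rounds out the trio: it drives a single reactor (-c 0x1 in the EAL line above) as hard as it can for the second and reports one aggregate figure — 926072 events per second in this run. A one-liner to pull just that number out for run-over-run comparison, paths as before:

SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
"$SPDK"/test/event/reactor_perf/reactor_perf -t 1 | awk '/Performance:/ {print $2}'   # -> events per second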
00:07:53.093 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:07:53.093 15:09:31 event.event_scheduler -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:53.093 15:09:31 event.event_scheduler -- common/autotest_common.sh@1693 -- # lcov --version 00:07:53.093 15:09:31 event.event_scheduler -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:53.093 15:09:31 event.event_scheduler -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@336 -- # IFS=.-: 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@336 -- # read -ra ver1 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@337 -- # IFS=.-: 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@337 -- # read -ra ver2 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@338 -- # local 'op=<' 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@340 -- # ver1_l=2 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@341 -- # ver2_l=1 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@344 -- # case "$op" in 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@345 -- # : 1 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@365 -- # decimal 1 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@353 -- # local d=1 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@355 -- # echo 1 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@365 -- # ver1[v]=1 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@366 -- # decimal 2 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@353 -- # local d=2 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:53.093 15:09:31 event.event_scheduler -- scripts/common.sh@355 -- # echo 2 00:07:53.094 15:09:31 event.event_scheduler -- scripts/common.sh@366 -- # ver2[v]=2 00:07:53.094 15:09:31 event.event_scheduler -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:53.094 15:09:31 event.event_scheduler -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:53.094 15:09:31 event.event_scheduler -- scripts/common.sh@368 -- # return 0 00:07:53.094 15:09:31 event.event_scheduler -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:53.094 15:09:31 event.event_scheduler -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:53.094 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:53.094 --rc genhtml_branch_coverage=1 00:07:53.094 --rc genhtml_function_coverage=1 00:07:53.094 --rc genhtml_legend=1 00:07:53.094 --rc geninfo_all_blocks=1 00:07:53.094 --rc geninfo_unexecuted_blocks=1 00:07:53.094 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:53.094 ' 00:07:53.094 15:09:31 event.event_scheduler -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:53.094 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:53.094 --rc genhtml_branch_coverage=1 00:07:53.094 --rc genhtml_function_coverage=1 00:07:53.094 --rc genhtml_legend=1 00:07:53.094 --rc geninfo_all_blocks=1 00:07:53.094 --rc geninfo_unexecuted_blocks=1 00:07:53.094 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:53.094 ' 00:07:53.094 15:09:31 event.event_scheduler -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:53.094 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:53.094 --rc genhtml_branch_coverage=1 00:07:53.094 --rc genhtml_function_coverage=1 00:07:53.094 --rc genhtml_legend=1 00:07:53.094 --rc geninfo_all_blocks=1 00:07:53.094 --rc geninfo_unexecuted_blocks=1 00:07:53.094 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:53.094 ' 00:07:53.094 15:09:31 event.event_scheduler -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:53.094 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:53.094 --rc genhtml_branch_coverage=1 00:07:53.094 --rc genhtml_function_coverage=1 00:07:53.094 --rc genhtml_legend=1 00:07:53.094 --rc geninfo_all_blocks=1 00:07:53.094 --rc geninfo_unexecuted_blocks=1 00:07:53.094 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:53.094 ' 00:07:53.094 15:09:31 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:07:53.094 15:09:31 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=1466967 00:07:53.094 15:09:31 event.event_scheduler -- 
scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:07:53.094 15:09:31 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:07:53.094 15:09:31 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 1466967 00:07:53.094 15:09:31 event.event_scheduler -- common/autotest_common.sh@835 -- # '[' -z 1466967 ']' 00:07:53.094 15:09:31 event.event_scheduler -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:53.094 15:09:31 event.event_scheduler -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:53.094 15:09:31 event.event_scheduler -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:53.094 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:53.094 15:09:31 event.event_scheduler -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:53.094 15:09:31 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:53.094 [2024-11-20 15:09:31.727451] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:07:53.094 [2024-11-20 15:09:31.727549] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1466967 ] 00:07:53.352 [2024-11-20 15:09:31.812937] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:53.352 [2024-11-20 15:09:31.840232] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.352 [2024-11-20 15:09:31.840310] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:53.352 [2024-11-20 15:09:31.840388] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:53.352 [2024-11-20 15:09:31.840390] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:53.352 15:09:31 event.event_scheduler -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:53.352 15:09:31 event.event_scheduler -- common/autotest_common.sh@868 -- # return 0 00:07:53.352 15:09:31 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:07:53.353 15:09:31 event.event_scheduler -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:53.353 15:09:31 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:53.353 [2024-11-20 15:09:31.925098] dpdk_governor.c: 178:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:07:53.353 [2024-11-20 15:09:31.925121] scheduler_dynamic.c: 280:init: *NOTICE*: Unable to initialize dpdk governor 00:07:53.353 [2024-11-20 15:09:31.925133] scheduler_dynamic.c: 427:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:07:53.353 [2024-11-20 15:09:31.925141] scheduler_dynamic.c: 429:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:07:53.353 [2024-11-20 15:09:31.925148] scheduler_dynamic.c: 431:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:07:53.353 15:09:31 event.event_scheduler -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:53.353 15:09:31 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:07:53.353 15:09:31 event.event_scheduler -- common/autotest_common.sh@563 -- # xtrace_disable 
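The ordering above is the point of --wait-for-rpc: the scheduler app brings its reactors up but holds subsystem initialization until told, which leaves a window in which framework_set_scheduler can install the dynamic scheduler before framework_start_init runs (the *ERROR* about SMT siblings is the dpdk governor refusing the partial core mask; the dynamic scheduler carries on with the load/core/busy thresholds the NOTICE lines show). The same bring-up replayed by hand, paths as in this job:

SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
"$SPDK"/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f &   # -p 0x2 becomes --main-lcore=2 above; -f carried over from the trace
until [ -S /var/tmp/spdk.sock ]; do sleep 0.1; done      # crude stand-in for the harness's waitforlisten
"$SPDK"/scripts/rpc.py framework_set_scheduler dynamic   # must precede init
"$SPDK"/scripts/rpc.py framework_start_init              # subsystems now start under the chosen scheduler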
00:07:53.353 15:09:31 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:53.353 [2024-11-20 15:09:31.996684] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:07:53.353 15:09:31 event.event_scheduler -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:53.353 15:09:31 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:07:53.353 15:09:31 event.event_scheduler -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:53.353 15:09:31 event.event_scheduler -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:53.353 15:09:31 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:53.353 ************************************ 00:07:53.353 START TEST scheduler_create_thread 00:07:53.353 ************************************ 00:07:53.353 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1129 -- # scheduler_create_thread 00:07:53.353 15:09:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:07:53.353 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:53.353 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:53.611 2 00:07:53.611 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:53.611 15:09:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:07:53.611 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:53.611 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:53.611 3 00:07:53.611 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:53.611 15:09:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:07:53.611 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:53.611 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:53.611 4 00:07:53.611 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:53.611 15:09:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:07:53.611 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:53.611 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:53.611 5 00:07:53.611 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:53.611 15:09:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:07:53.611 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:53.611 
15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:53.611 6 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:53.612 7 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:53.612 8 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:53.612 9 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:53.612 10 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:53.612 15:09:32 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:53.612 15:09:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:54.987 15:09:33 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:54.987 15:09:33 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:54.987 15:09:33 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:54.987 15:09:33 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:54.987 15:09:33 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:56.363 15:09:34 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:56.363 00:07:56.363 real 0m2.619s 00:07:56.363 user 0m0.023s 00:07:56.363 sys 0m0.008s 00:07:56.363 15:09:34 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:56.363 15:09:34 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:56.363 ************************************ 00:07:56.363 END TEST scheduler_create_thread 00:07:56.363 ************************************ 00:07:56.363 15:09:34 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:56.363 15:09:34 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 1466967 00:07:56.363 15:09:34 event.event_scheduler -- common/autotest_common.sh@954 -- # '[' -z 1466967 ']' 00:07:56.363 15:09:34 event.event_scheduler -- common/autotest_common.sh@958 -- # kill -0 1466967 00:07:56.363 15:09:34 event.event_scheduler -- common/autotest_common.sh@959 -- # uname 00:07:56.363 15:09:34 event.event_scheduler -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:56.363 15:09:34 event.event_scheduler -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1466967 00:07:56.363 15:09:34 event.event_scheduler -- common/autotest_common.sh@960 -- # process_name=reactor_2 00:07:56.363 15:09:34 event.event_scheduler -- common/autotest_common.sh@964 -- # '[' reactor_2 = sudo ']' 00:07:56.363 15:09:34 event.event_scheduler -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1466967' 00:07:56.363 killing process with pid 1466967 00:07:56.363 15:09:34 event.event_scheduler -- common/autotest_common.sh@973 -- # kill 1466967 00:07:56.363 15:09:34 event.event_scheduler -- common/autotest_common.sh@978 -- # wait 1466967 00:07:56.622 [2024-11-20 15:09:35.135301] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
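
The xtrace above compresses the whole scheduler exercise into one stream: the test app is launched on four cores but held at --wait-for-rpc, the dynamic scheduler is selected, initialization is released, and then pinned and unpinned threads are created over RPC. Below is a minimal standalone sketch of that call sequence, reconstructed from the trace; the binary path, RPC names, masks, and activity percentages are taken verbatim from the log, while SPDK_ROOT, the rpc helper, and the thread-id capture are assumptions about what scheduler.sh does, not the script itself (rpc_cmd in the trace wraps scripts/rpc.py).

#!/usr/bin/env bash
# Sketch of the scheduler test flow traced above (structure assumed, values from the log).
set -euo pipefail

SPDK_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # workspace path from the log
rpc() { "$SPDK_ROOT/scripts/rpc.py" "$@"; }                     # talks to /var/tmp/spdk.sock by default

# Launch on 4 cores (-m 0xF) with main lcore 2 (-p 0x2), held before init by --wait-for-rpc.
"$SPDK_ROOT/test/event/scheduler/scheduler" -m 0xF -p 0x2 --wait-for-rpc -f &
scheduler_pid=$!
trap 'kill -9 "$scheduler_pid" 2>/dev/null' EXIT
# (the real test waits for the RPC socket to come up here before issuing any calls)

rpc framework_set_scheduler dynamic
rpc framework_start_init

# One fully busy (-a 100) and one fully idle (-a 0) thread pinned to each of the four cores.
for mask in 0x1 0x2 0x4 0x8; do
    rpc --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m "$mask" -a 100
done
for mask in 0x1 0x2 0x4 0x8; do
    rpc --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m "$mask" -a 0
done

# Unpinned threads: one ~30% active, one created idle then raised to 50%, one created and deleted.
rpc --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30
thread_id=$(rpc --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0)   # id 11 in the trace
rpc --plugin scheduler_plugin scheduler_thread_set_active "$thread_id" 50
thread_id=$(rpc --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100)     # id 12 in the trace
rpc --plugin scheduler_plugin scheduler_thread_delete "$thread_id"

The dpdk_governor ERROR followed by the three set_opts NOTICEs earlier in the trace reads as one event: the app core mask covers only some SMT siblings on this host, so the governor refuses to initialize and the dynamic scheduler falls back to its default load limit (20), core limit (80), and core-busy threshold (95).
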
00:07:56.881 00:07:56.881 real 0m3.802s 00:07:56.881 user 0m5.743s 00:07:56.881 sys 0m0.454s 00:07:56.881 15:09:35 event.event_scheduler -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:56.881 15:09:35 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:56.881 ************************************ 00:07:56.881 END TEST event_scheduler 00:07:56.881 ************************************ 00:07:56.881 15:09:35 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:56.881 15:09:35 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:56.881 15:09:35 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:56.881 15:09:35 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:56.881 15:09:35 event -- common/autotest_common.sh@10 -- # set +x 00:07:56.881 ************************************ 00:07:56.881 START TEST app_repeat 00:07:56.881 ************************************ 00:07:56.881 15:09:35 event.app_repeat -- common/autotest_common.sh@1129 -- # app_repeat_test 00:07:56.881 15:09:35 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:56.881 15:09:35 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:56.881 15:09:35 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:56.881 15:09:35 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:56.881 15:09:35 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:56.881 15:09:35 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:56.881 15:09:35 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:56.881 15:09:35 event.app_repeat -- event/event.sh@19 -- # repeat_pid=1467554 00:07:56.881 15:09:35 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:56.881 15:09:35 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1467554' 00:07:56.881 Process app_repeat pid: 1467554 00:07:56.881 15:09:35 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:56.881 15:09:35 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:56.881 spdk_app_start Round 0 00:07:56.881 15:09:35 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1467554 /var/tmp/spdk-nbd.sock 00:07:56.881 15:09:35 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 1467554 ']' 00:07:56.881 15:09:35 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:56.881 15:09:35 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:56.881 15:09:35 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:56.881 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:56.881 15:09:35 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:56.881 15:09:35 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:56.881 15:09:35 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:56.881 [2024-11-20 15:09:35.414617] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
00:07:56.881 [2024-11-20 15:09:35.414700] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1467554 ] 00:07:56.881 [2024-11-20 15:09:35.502063] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:56.881 [2024-11-20 15:09:35.527412] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:56.881 [2024-11-20 15:09:35.527416] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.141 15:09:35 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:57.141 15:09:35 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:57.141 15:09:35 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:57.141 Malloc0 00:07:57.141 15:09:35 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:57.399 Malloc1 00:07:57.399 15:09:36 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:57.399 15:09:36 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:57.399 15:09:36 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:57.399 15:09:36 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:57.399 15:09:36 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:57.399 15:09:36 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:57.399 15:09:36 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:57.399 15:09:36 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:57.399 15:09:36 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:57.399 15:09:36 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:57.399 15:09:36 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:57.399 15:09:36 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:57.399 15:09:36 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:57.399 15:09:36 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:57.399 15:09:36 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:57.399 15:09:36 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:57.657 /dev/nbd0 00:07:57.657 15:09:36 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:57.657 15:09:36 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:57.657 15:09:36 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:07:57.658 15:09:36 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:57.658 15:09:36 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:57.658 15:09:36 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:57.658 15:09:36 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 
/proc/partitions 00:07:57.658 15:09:36 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:57.658 15:09:36 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:57.658 15:09:36 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:57.658 15:09:36 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:57.658 1+0 records in 00:07:57.658 1+0 records out 00:07:57.658 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000239341 s, 17.1 MB/s 00:07:57.658 15:09:36 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:57.658 15:09:36 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:57.658 15:09:36 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:57.658 15:09:36 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:57.658 15:09:36 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:57.658 15:09:36 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:57.658 15:09:36 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:57.658 15:09:36 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:57.916 /dev/nbd1 00:07:57.916 15:09:36 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:57.916 15:09:36 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:57.916 15:09:36 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:07:57.916 15:09:36 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:57.916 15:09:36 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:57.916 15:09:36 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:57.916 15:09:36 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:07:57.916 15:09:36 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:57.916 15:09:36 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:57.916 15:09:36 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:57.916 15:09:36 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:57.916 1+0 records in 00:07:57.916 1+0 records out 00:07:57.916 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000324556 s, 12.6 MB/s 00:07:57.916 15:09:36 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:57.916 15:09:36 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:57.916 15:09:36 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:57.916 15:09:36 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:57.916 15:09:36 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:57.916 15:09:36 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:57.916 15:09:36 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 
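
Both nbd_start_disk calls above are followed by the same waitfornbd pattern: poll /proc/partitions until the kernel lists the device, then prove it usable with a single O_DIRECT 4 KiB read. A condensed sketch of that readiness check, with the grep/dd/stat commands lifted from the trace; the retry ceiling of 20 matches the loop counters in the log, while the sleep interval and the temp-file path are assumptions:

# Readiness check for a freshly exported nbd device, as traced above.
waitfornbd() {
    local nbd_name=$1 tmp=/tmp/nbdtest i size

    # Wait for the kernel to list the device (the grep -q -w in the trace).
    for ((i = 1; i <= 20; i++)); do
        if grep -q -w "$nbd_name" /proc/partitions; then
            break
        fi
        sleep 0.1
    done

    # Read one 4 KiB block with O_DIRECT; a non-empty copy means the device is live.
    for ((i = 1; i <= 20; i++)); do
        if dd if="/dev/$nbd_name" of="$tmp" bs=4096 count=1 iflag=direct 2>/dev/null; then
            size=$(stat -c %s "$tmp")
            rm -f "$tmp"
            if [ "$size" != 0 ]; then
                return 0
            fi
        fi
        sleep 0.1
    done
    return 1
}

In the trace both checks succeed on the first attempt for nbd0 and nbd1 (the "1+0 records in/out" lines with ~12-18 MB/s throughput immediately after each grep).
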
00:07:57.916 15:09:36 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:57.916 15:09:36 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:57.916 15:09:36 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:58.175 { 00:07:58.175 "nbd_device": "/dev/nbd0", 00:07:58.175 "bdev_name": "Malloc0" 00:07:58.175 }, 00:07:58.175 { 00:07:58.175 "nbd_device": "/dev/nbd1", 00:07:58.175 "bdev_name": "Malloc1" 00:07:58.175 } 00:07:58.175 ]' 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:58.175 { 00:07:58.175 "nbd_device": "/dev/nbd0", 00:07:58.175 "bdev_name": "Malloc0" 00:07:58.175 }, 00:07:58.175 { 00:07:58.175 "nbd_device": "/dev/nbd1", 00:07:58.175 "bdev_name": "Malloc1" 00:07:58.175 } 00:07:58.175 ]' 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:58.175 /dev/nbd1' 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:58.175 /dev/nbd1' 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:58.175 256+0 records in 00:07:58.175 256+0 records out 00:07:58.175 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0107409 s, 97.6 MB/s 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:58.175 256+0 records in 00:07:58.175 256+0 records out 00:07:58.175 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0200051 s, 52.4 MB/s 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:58.175 256+0 records in 00:07:58.175 256+0 records out 00:07:58.175 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0214867 s, 48.8 
MB/s 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:58.175 15:09:36 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:58.434 15:09:36 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:58.434 15:09:36 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:58.434 15:09:36 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:58.434 15:09:36 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:58.434 15:09:36 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:58.434 15:09:36 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:58.434 15:09:36 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:58.434 15:09:36 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:58.434 15:09:37 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:58.434 15:09:37 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:58.434 15:09:37 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:58.434 15:09:37 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:58.434 15:09:37 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:58.434 15:09:37 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:58.434 15:09:37 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:58.434 15:09:37 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:58.434 15:09:37 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:58.434 15:09:37 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:58.692 15:09:37 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:58.692 15:09:37 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:58.692 15:09:37 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:58.692 15:09:37 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:58.692 15:09:37 
event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:58.692 15:09:37 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:58.692 15:09:37 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:58.692 15:09:37 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:58.692 15:09:37 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:58.692 15:09:37 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:58.692 15:09:37 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:58.951 15:09:37 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:58.951 15:09:37 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:58.951 15:09:37 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:58.951 15:09:37 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:58.951 15:09:37 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:58.951 15:09:37 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:58.951 15:09:37 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:58.951 15:09:37 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:58.951 15:09:37 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:58.951 15:09:37 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:58.951 15:09:37 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:58.951 15:09:37 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:58.951 15:09:37 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:59.209 15:09:37 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:59.468 [2024-11-20 15:09:37.910612] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:59.468 [2024-11-20 15:09:37.934123] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:59.468 [2024-11-20 15:09:37.934125] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.468 [2024-11-20 15:09:37.975679] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:59.468 [2024-11-20 15:09:37.975722] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:08:02.760 15:09:40 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:08:02.760 15:09:40 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:08:02.760 spdk_app_start Round 1 00:08:02.760 15:09:40 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1467554 /var/tmp/spdk-nbd.sock 00:08:02.760 15:09:40 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 1467554 ']' 00:08:02.760 15:09:40 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:02.760 15:09:40 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:08:02.760 15:09:40 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:02.760 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:08:02.760 15:09:40 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:08:02.760 15:09:40 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:02.760 15:09:40 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:08:02.760 15:09:40 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:08:02.760 15:09:40 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:02.760 Malloc0 00:08:02.760 15:09:41 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:02.760 Malloc1 00:08:02.760 15:09:41 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:02.760 15:09:41 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:02.760 15:09:41 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:02.760 15:09:41 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:02.760 15:09:41 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:02.760 15:09:41 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:02.761 15:09:41 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:02.761 15:09:41 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:02.761 15:09:41 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:02.761 15:09:41 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:02.761 15:09:41 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:02.761 15:09:41 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:02.761 15:09:41 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:08:02.761 15:09:41 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:02.761 15:09:41 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:02.761 15:09:41 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:03.020 /dev/nbd0 00:08:03.020 15:09:41 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:03.020 15:09:41 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:03.020 15:09:41 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:08:03.020 15:09:41 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:08:03.020 15:09:41 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:08:03.020 15:09:41 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:08:03.020 15:09:41 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:08:03.020 15:09:41 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:08:03.020 15:09:41 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:08:03.020 15:09:41 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:08:03.020 15:09:41 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:03.020 1+0 records in 00:08:03.020 1+0 records out 00:08:03.020 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000222597 s, 18.4 MB/s 00:08:03.020 15:09:41 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:08:03.020 15:09:41 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:08:03.020 15:09:41 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:08:03.020 15:09:41 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:08:03.020 15:09:41 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:08:03.020 15:09:41 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:03.020 15:09:41 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:03.020 15:09:41 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:08:03.279 /dev/nbd1 00:08:03.279 15:09:41 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:03.279 15:09:41 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:03.279 15:09:41 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:08:03.279 15:09:41 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:08:03.279 15:09:41 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:08:03.279 15:09:41 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:08:03.279 15:09:41 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:08:03.279 15:09:41 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:08:03.279 15:09:41 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:08:03.279 15:09:41 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:08:03.279 15:09:41 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:03.279 1+0 records in 00:08:03.279 1+0 records out 00:08:03.279 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253087 s, 16.2 MB/s 00:08:03.279 15:09:41 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:08:03.279 15:09:41 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:08:03.279 15:09:41 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:08:03.279 15:09:41 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:08:03.279 15:09:41 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:08:03.279 15:09:41 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:03.279 15:09:41 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:03.279 15:09:41 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:03.279 15:09:41 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:03.279 15:09:41 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:08:03.538 15:09:42 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:03.538 { 00:08:03.538 "nbd_device": "/dev/nbd0", 00:08:03.538 "bdev_name": "Malloc0" 00:08:03.538 }, 00:08:03.538 { 00:08:03.538 "nbd_device": "/dev/nbd1", 00:08:03.538 "bdev_name": "Malloc1" 00:08:03.538 } 00:08:03.539 ]' 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:03.539 { 00:08:03.539 "nbd_device": "/dev/nbd0", 00:08:03.539 "bdev_name": "Malloc0" 00:08:03.539 }, 00:08:03.539 { 00:08:03.539 "nbd_device": "/dev/nbd1", 00:08:03.539 "bdev_name": "Malloc1" 00:08:03.539 } 00:08:03.539 ]' 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:03.539 /dev/nbd1' 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:03.539 /dev/nbd1' 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:08:03.539 256+0 records in 00:08:03.539 256+0 records out 00:08:03.539 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00359119 s, 292 MB/s 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:03.539 256+0 records in 00:08:03.539 256+0 records out 00:08:03.539 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0200567 s, 52.3 MB/s 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:03.539 256+0 records in 00:08:03.539 256+0 records out 00:08:03.539 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0218663 s, 48.0 MB/s 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@71 -- # 
local operation=verify 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:03.539 15:09:42 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:03.798 15:09:42 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:03.798 15:09:42 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:03.798 15:09:42 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:03.798 15:09:42 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:03.798 15:09:42 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:03.798 15:09:42 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:03.798 15:09:42 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:03.798 15:09:42 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:03.798 15:09:42 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:03.798 15:09:42 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:04.057 15:09:42 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:04.057 15:09:42 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:04.057 15:09:42 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:04.058 15:09:42 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:04.058 15:09:42 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:04.058 15:09:42 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:04.058 15:09:42 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:04.058 15:09:42 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:04.058 15:09:42 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
nbd_get_count /var/tmp/spdk-nbd.sock 00:08:04.058 15:09:42 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:04.058 15:09:42 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:04.317 15:09:42 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:04.317 15:09:42 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:04.317 15:09:42 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:04.317 15:09:42 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:04.317 15:09:42 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:08:04.317 15:09:42 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:04.317 15:09:42 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:08:04.317 15:09:42 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:08:04.317 15:09:42 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:08:04.317 15:09:42 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:08:04.317 15:09:42 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:04.317 15:09:42 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:08:04.317 15:09:42 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:08:04.576 15:09:43 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:08:04.576 [2024-11-20 15:09:43.231796] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:04.576 [2024-11-20 15:09:43.256430] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:08:04.576 [2024-11-20 15:09:43.256432] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.835 [2024-11-20 15:09:43.304101] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:08:04.835 [2024-11-20 15:09:43.304149] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:08:08.120 15:09:46 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:08:08.120 15:09:46 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:08:08.120 spdk_app_start Round 2 00:08:08.120 15:09:46 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1467554 /var/tmp/spdk-nbd.sock 00:08:08.120 15:09:46 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 1467554 ']' 00:08:08.120 15:09:46 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:08.120 15:09:46 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:08:08.120 15:09:46 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:08.120 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:08:08.120 15:09:46 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:08:08.120 15:09:46 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:08.120 15:09:46 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:08:08.120 15:09:46 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:08:08.120 15:09:46 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:08.120 Malloc0 00:08:08.120 15:09:46 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:08.120 Malloc1 00:08:08.120 15:09:46 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:08.120 15:09:46 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:08.120 15:09:46 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:08.120 15:09:46 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:08.120 15:09:46 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:08.120 15:09:46 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:08.120 15:09:46 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:08.120 15:09:46 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:08.120 15:09:46 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:08.120 15:09:46 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:08.120 15:09:46 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:08.120 15:09:46 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:08.120 15:09:46 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:08:08.120 15:09:46 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:08.120 15:09:46 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:08.120 15:09:46 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:08.379 /dev/nbd0 00:08:08.379 15:09:46 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:08.379 15:09:46 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:08.379 15:09:46 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:08:08.379 15:09:46 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:08:08.379 15:09:46 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:08:08.379 15:09:46 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:08:08.379 15:09:46 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:08:08.379 15:09:46 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:08:08.379 15:09:46 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:08:08.379 15:09:46 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:08:08.379 15:09:46 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:08.379 1+0 records in 00:08:08.379 1+0 records out 00:08:08.379 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00023117 s, 17.7 MB/s 00:08:08.379 15:09:46 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:08:08.379 15:09:46 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:08:08.379 15:09:46 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:08:08.379 15:09:46 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:08:08.379 15:09:46 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:08:08.379 15:09:46 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:08.379 15:09:46 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:08.379 15:09:46 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:08:08.637 /dev/nbd1 00:08:08.637 15:09:47 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:08.637 15:09:47 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:08.637 15:09:47 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:08:08.637 15:09:47 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:08:08.637 15:09:47 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:08:08.637 15:09:47 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:08:08.637 15:09:47 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:08:08.637 15:09:47 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:08:08.637 15:09:47 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:08:08.637 15:09:47 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:08:08.637 15:09:47 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:08.637 1+0 records in 00:08:08.637 1+0 records out 00:08:08.637 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000255393 s, 16.0 MB/s 00:08:08.637 15:09:47 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:08:08.637 15:09:47 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:08:08.637 15:09:47 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:08:08.637 15:09:47 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:08:08.637 15:09:47 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:08:08.637 15:09:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:08.637 15:09:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:08.637 15:09:47 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:08.637 15:09:47 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:08.637 15:09:47 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:08:08.895 {
00:08:08.895 "nbd_device": "/dev/nbd0",
00:08:08.895 "bdev_name": "Malloc0"
00:08:08.895 },
00:08:08.895 {
00:08:08.895 "nbd_device": "/dev/nbd1",
00:08:08.895 "bdev_name": "Malloc1"
00:08:08.895 }
00:08:08.895 ]'
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[
00:08:08.895 {
00:08:08.895 "nbd_device": "/dev/nbd0",
00:08:08.895 "bdev_name": "Malloc0"
00:08:08.895 },
00:08:08.895 {
00:08:08.895 "nbd_device": "/dev/nbd1",
00:08:08.895 "bdev_name": "Malloc1"
00:08:08.895 }
00:08:08.895 ]'
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:08:08.895 /dev/nbd1'
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:08:08.895 /dev/nbd1'
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']'
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256
00:08:08.895 256+0 records in
00:08:08.895 256+0 records out
00:08:08.895 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0108845 s, 96.3 MB/s
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:08:08.895 256+0 records in
00:08:08.895 256+0 records out
00:08:08.895 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0206788 s, 50.7 MB/s
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:08:08.895 256+0 records in
00:08:08.895 256+0 records out
00:08:08.895 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0217168 s, 48.3 MB/s
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']'
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']'
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0
00:08:08.895 15:09:47 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:08:08.896 15:09:47 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1
00:08:08.896 15:09:47 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:08:08.896 15:09:47 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:08:08.896 15:09:47 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:08:08.896 15:09:47 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:08:08.896 15:09:47 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list
00:08:08.896 15:09:47 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i
00:08:08.896 15:09:47 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:08.896 15:09:47 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:08:09.154 15:09:47 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:08:09.154 15:09:47 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:08:09.154 15:09:47 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:08:09.154 15:09:47 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:09.154 15:09:47 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:09.154 15:09:47 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:08:09.154 15:09:47 event.app_repeat -- bdev/nbd_common.sh@41 -- # break
00:08:09.154 15:09:47 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0
00:08:09.154 15:09:47 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:09.154 15:09:47 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:08:09.412 15:09:47 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:08:09.412 15:09:47 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:08:09.412 15:09:47 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:08:09.412 15:09:47 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:09.412 15:09:47 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:09.412 15:09:47 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:08:09.412 15:09:47 event.app_repeat -- bdev/nbd_common.sh@41 -- # break
00:08:09.412 15:09:47 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0
00:08:09.412 15:09:47 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:08:09.412 15:09:47 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:08:09.412 15:09:47 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:08:09.670 15:09:48 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:08:09.670 15:09:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]'
00:08:09.670 15:09:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:08:09.670 15:09:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:08:09.670 15:09:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo ''
00:08:09.670 15:09:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:08:09.670 15:09:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # true
00:08:09.670 15:09:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0
00:08:09.670 15:09:48 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0
00:08:09.670 15:09:48 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0
00:08:09.670 15:09:48 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:08:09.670 15:09:48 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0
00:08:09.670 15:09:48 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
00:08:09.929 15:09:48 event.app_repeat -- event/event.sh@35 -- # sleep 3
00:08:09.929 [2024-11-20 15:09:48.540370] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2
00:08:09.929 [2024-11-20 15:09:48.565075] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1
00:08:09.929 [2024-11-20 15:09:48.565077] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:08:09.929 [2024-11-20 15:09:48.611886] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered.
00:08:09.929 [2024-11-20 15:09:48.611928] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered.
00:08:13.211 15:09:51 event.app_repeat -- event/event.sh@38 -- # waitforlisten 1467554 /var/tmp/spdk-nbd.sock
00:08:13.211 15:09:51 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 1467554 ']'
00:08:13.211 15:09:51 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:08:13.211 15:09:51 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100
00:08:13.211 15:09:51 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
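The nbd_dd_data_verify trace above reduces to a simple pattern: fill a temp file with random bytes, dd it onto every exported NBD device with O_DIRECT, then cmp each device back against the file. A condensed sketch of that flow, reconstructed from the trace (paths, block size, and count taken from the log; not the verbatim helper):

    # Sketch of the write/verify pattern traced above (bs/count/paths from the log).
    nbd_dd_data_verify() {
        local nbd_list=($1) operation=$2
        local tmp_file=/tmp/nbdrandtest   # the test itself uses $SPDK_DIR/test/event/nbdrandtest

        if [ "$operation" = write ]; then
            # 256 * 4 KiB = 1 MiB of random data, copied to every device
            dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
            for i in "${nbd_list[@]}"; do
                dd if="$tmp_file" of="$i" bs=4096 count=256 oflag=direct
            done
        elif [ "$operation" = verify ]; then
            # byte-wise compare of the first 1 MiB; cmp exits non-zero on mismatch
            for i in "${nbd_list[@]}"; do
                cmp -b -n 1M "$tmp_file" "$i"
            done
            rm "$tmp_file"
        fi
    }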
00:08:13.211 15:09:51 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable
00:08:13.211 15:09:51 event.app_repeat -- common/autotest_common.sh@10 -- # set +x
00:08:13.211 15:09:51 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 ))
00:08:13.211 15:09:51 event.app_repeat -- common/autotest_common.sh@868 -- # return 0
00:08:13.211 15:09:51 event.app_repeat -- event/event.sh@39 -- # killprocess 1467554
00:08:13.211 15:09:51 event.app_repeat -- common/autotest_common.sh@954 -- # '[' -z 1467554 ']'
00:08:13.211 15:09:51 event.app_repeat -- common/autotest_common.sh@958 -- # kill -0 1467554
00:08:13.211 15:09:51 event.app_repeat -- common/autotest_common.sh@959 -- # uname
00:08:13.211 15:09:51 event.app_repeat -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']'
00:08:13.211 15:09:51 event.app_repeat -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1467554
00:08:13.211 15:09:51 event.app_repeat -- common/autotest_common.sh@960 -- # process_name=reactor_0
00:08:13.211 15:09:51 event.app_repeat -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']'
00:08:13.211 15:09:51 event.app_repeat -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1467554'
killing process with pid 1467554
15:09:51 event.app_repeat -- common/autotest_common.sh@973 -- # kill 1467554
00:08:13.211 15:09:51 event.app_repeat -- common/autotest_common.sh@978 -- # wait 1467554
00:08:13.211 spdk_app_start is called in Round 0.
00:08:13.211 Shutdown signal received, stop current app iteration
00:08:13.211 Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 reinitialization...
00:08:13.211 spdk_app_start is called in Round 1.
00:08:13.211 Shutdown signal received, stop current app iteration
00:08:13.211 Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 reinitialization...
00:08:13.211 spdk_app_start is called in Round 2.
00:08:13.211 Shutdown signal received, stop current app iteration
00:08:13.211 Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 reinitialization...
00:08:13.211 spdk_app_start is called in Round 3.
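The killprocess helper traced repeatedly in this run follows a defensive shutdown pattern: confirm the PID is non-empty, probe it with kill -0, check the process name on Linux before signalling (never SIGTERM a sudo wrapper directly), then kill and wait. A sketch reconstructed from the trace, not the verbatim helper:

    # Reconstructed from the killprocess trace above; checks before signalling.
    killprocess() {
        local pid=$1 process_name
        [ -z "$pid" ] && return 1           # no pid given
        kill -0 "$pid" || return 0           # process already gone, nothing to do
        if [ "$(uname)" = Linux ]; then
            process_name=$(ps --no-headers -o comm= "$pid")
        fi
        # reactors show up as "reactor_0"; a sudo wrapper would need its child
        # signalled instead, which this sketch does not attempt
        [ "$process_name" = sudo ] && return 1
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid" 2>/dev/null || true      # wait only succeeds for our own children
    }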
00:08:13.211 Shutdown signal received, stop current app iteration
00:08:13.211 15:09:51 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT
00:08:13.211 15:09:51 event.app_repeat -- event/event.sh@42 -- # return 0
00:08:13.211
00:08:13.211 real 0m16.383s
00:08:13.211 user 0m35.313s
00:08:13.211 sys 0m3.293s
00:08:13.211 15:09:51 event.app_repeat -- common/autotest_common.sh@1130 -- # xtrace_disable
00:08:13.211 15:09:51 event.app_repeat -- common/autotest_common.sh@10 -- # set +x
00:08:13.211 ************************************
00:08:13.211 END TEST app_repeat
00:08:13.211 ************************************
00:08:13.211 15:09:51 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 ))
00:08:13.211 15:09:51 event -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh
00:08:13.211 15:09:51 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:08:13.211 15:09:51 event -- common/autotest_common.sh@1111 -- # xtrace_disable
00:08:13.211 15:09:51 event -- common/autotest_common.sh@10 -- # set +x
00:08:13.211 ************************************
00:08:13.211 START TEST cpu_locks
00:08:13.211 ************************************
00:08:13.211 15:09:51 event.cpu_locks -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh
00:08:13.469 * Looking for test storage...
00:08:13.469 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event
00:08:13.469 15:09:51 event.cpu_locks -- common/autotest_common.sh@1692 -- # [[ y == y ]]
00:08:13.469 15:09:51 event.cpu_locks -- common/autotest_common.sh@1693 -- # lcov --version
00:08:13.469 15:09:51 event.cpu_locks -- common/autotest_common.sh@1693 -- # awk '{print $NF}'
00:08:13.469 15:09:52 event.cpu_locks -- common/autotest_common.sh@1693 -- # lt 1.15 2
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@333 -- # local ver1 ver1_l
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@334 -- # local ver2 ver2_l
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@336 -- # IFS=.-:
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@336 -- # read -ra ver1
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@337 -- # IFS=.-:
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@337 -- # read -ra ver2
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@338 -- # local 'op=<'
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@340 -- # ver1_l=2
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@341 -- # ver2_l=1
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@344 -- # case "$op" in
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@345 -- # : 1
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@364 -- # (( v = 0 ))
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@365 -- # decimal 1
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@353 -- # local d=1
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]]
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@355 -- # echo 1
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@365 -- # ver1[v]=1
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@366 -- # decimal 2
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@353 -- # local d=2
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]]
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@355 -- # echo 2
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@366 -- # ver2[v]=2
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] ))
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] ))
00:08:13.469 15:09:52 event.cpu_locks -- scripts/common.sh@368 -- # return 0
00:08:13.469 15:09:52 event.cpu_locks -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:08:13.469 15:09:52 event.cpu_locks -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS=
00:08:13.469 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:08:13.469 --rc genhtml_branch_coverage=1
00:08:13.469 --rc genhtml_function_coverage=1
00:08:13.469 --rc genhtml_legend=1
00:08:13.469 --rc geninfo_all_blocks=1
00:08:13.469 --rc geninfo_unexecuted_blocks=1
00:08:13.469 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:08:13.469 '
00:08:13.469 15:09:52 event.cpu_locks -- common/autotest_common.sh@1706 -- # LCOV_OPTS='
00:08:13.469 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:08:13.469 --rc genhtml_branch_coverage=1
00:08:13.469 --rc genhtml_function_coverage=1
00:08:13.469 --rc genhtml_legend=1
00:08:13.469 --rc geninfo_all_blocks=1
00:08:13.469 --rc geninfo_unexecuted_blocks=1
00:08:13.469 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:08:13.469 '
00:08:13.469 15:09:52 event.cpu_locks -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov
00:08:13.470 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:08:13.470 --rc genhtml_branch_coverage=1
00:08:13.470 --rc genhtml_function_coverage=1
00:08:13.470 --rc genhtml_legend=1
00:08:13.470 --rc geninfo_all_blocks=1
00:08:13.470 --rc geninfo_unexecuted_blocks=1
00:08:13.470 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:08:13.470 '
00:08:13.470 15:09:52 event.cpu_locks -- common/autotest_common.sh@1707 -- # LCOV='lcov
00:08:13.470 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:08:13.470 --rc genhtml_branch_coverage=1
00:08:13.470 --rc genhtml_function_coverage=1
00:08:13.470 --rc genhtml_legend=1
00:08:13.470 --rc geninfo_all_blocks=1
00:08:13.470 --rc geninfo_unexecuted_blocks=1
00:08:13.470 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:08:13.470 '
00:08:13.470 15:09:52 event.cpu_locks -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock
00:08:13.470 15:09:52 event.cpu_locks -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock
00:08:13.470 15:09:52 event.cpu_locks -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT
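The lt 1.15 2 check traced above splits both version strings on IFS=.-: into arrays and compares them component-wise, validating each component as a decimal before the numeric compare. A minimal standalone rendition of that logic (ver_lt is a hypothetical name for this sketch, not the script's own):

    # Minimal component-wise version compare, following the cmp_versions trace above.
    ver_lt() {
        local -a ver1 ver2
        local v max
        IFS=.-: read -ra ver1 <<< "$1"
        IFS=.-: read -ra ver2 <<< "$2"
        max=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
        for (( v = 0; v < max; v++ )); do
            # missing components compare as 0 (e.g. 1.15 vs 2 -> 1.15 vs 2.0);
            # the real script also regex-checks each component as ^[0-9]+$
            local a=${ver1[v]:-0} b=${ver2[v]:-0}
            (( a > b )) && return 1
            (( a < b )) && return 0
        done
        return 1   # equal is not less-than
    }

    ver_lt 1.15 2 && echo "older"   # prints "older", matching the trace's return 0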
00:08:13.470 15:09:52 event.cpu_locks -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks
00:08:13.470 15:09:52 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:08:13.470 15:09:52 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable
00:08:13.470 15:09:52 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x
00:08:13.470 ************************************
00:08:13.470 START TEST default_locks
00:08:13.470 ************************************
00:08:13.470 15:09:52 event.cpu_locks.default_locks -- common/autotest_common.sh@1129 -- # default_locks
00:08:13.470 15:09:52 event.cpu_locks.default_locks -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=1469936
00:08:13.470 15:09:52 event.cpu_locks.default_locks -- event/cpu_locks.sh@47 -- # waitforlisten 1469936
00:08:13.470 15:09:52 event.cpu_locks.default_locks -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1
00:08:13.470 15:09:52 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # '[' -z 1469936 ']'
00:08:13.470 15:09:52 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock
00:08:13.470 15:09:52 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # local max_retries=100
00:08:13.470 15:09:52 event.cpu_locks.default_locks -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:08:13.470 15:09:52 event.cpu_locks.default_locks -- common/autotest_common.sh@844 -- # xtrace_disable
00:08:13.470 15:09:52 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x
00:08:13.470 [2024-11-20 15:09:52.119180] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization...
00:08:13.470 [2024-11-20 15:09:52.119243] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1469936 ]
00:08:13.728 [2024-11-20 15:09:52.207333] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:13.728 [2024-11-20 15:09:52.231050] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:08:13.988 15:09:52 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # (( i == 0 ))
00:08:13.988 15:09:52 event.cpu_locks.default_locks -- common/autotest_common.sh@868 -- # return 0
00:08:13.988 15:09:52 event.cpu_locks.default_locks -- event/cpu_locks.sh@49 -- # locks_exist 1469936
00:08:13.988 15:09:52 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # lslocks -p 1469936
00:08:13.988 15:09:52 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock
00:08:14.552 lslocks: write error
00:08:14.552 15:09:53 event.cpu_locks.default_locks -- event/cpu_locks.sh@50 -- # killprocess 1469936
00:08:14.552 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@954 -- # '[' -z 1469936 ']'
00:08:14.552 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@958 -- # kill -0 1469936
00:08:14.552 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@959 -- # uname
00:08:14.552 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']'
00:08:14.552 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1469936
00:08:14.552 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # process_name=reactor_0
00:08:14.552 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']'
00:08:14.552 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1469936'
killing process with pid 1469936
15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@973 -- # kill 1469936
15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@978 -- # wait 1469936
00:08:15.117 15:09:53 event.cpu_locks.default_locks -- event/cpu_locks.sh@52 -- # NOT waitforlisten 1469936
00:08:15.117 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@652 -- # local es=0
00:08:15.117 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 1469936
00:08:15.117 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@640 -- # local arg=waitforlisten
00:08:15.117 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in
00:08:15.117 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # type -t waitforlisten
00:08:15.117 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in
00:08:15.117 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@655 -- # waitforlisten 1469936
00:08:15.117 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # '[' -z 1469936 ']'
00:08:15.118 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock
00:08:15.118 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # local max_retries=100
00:08:15.118 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:08:15.118 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@844 -- # xtrace_disable
00:08:15.118 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x
00:08:15.118 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (1469936) - No such process
00:08:15.118 ERROR: process (pid: 1469936) is no longer running
00:08:15.118 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # (( i == 0 ))
00:08:15.118 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@868 -- # return 1
00:08:15.118 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@655 -- # es=1
00:08:15.118 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@663 -- # (( es > 128 ))
00:08:15.118 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@674 -- # [[ -n '' ]]
00:08:15.118 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@679 -- # (( !es == 0 ))
00:08:15.118 15:09:53 event.cpu_locks.default_locks -- event/cpu_locks.sh@54 -- # no_locks
00:08:15.118 15:09:53 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # lock_files=()
00:08:15.118 15:09:53 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # local lock_files
00:08:15.118 15:09:53 event.cpu_locks.default_locks -- event/cpu_locks.sh@27 -- # (( 0 != 0 ))
00:08:15.118
00:08:15.118 real 0m1.420s
00:08:15.118 user 0m1.415s
00:08:15.118 sys 0m0.711s
00:08:15.118 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@1130 -- # xtrace_disable
00:08:15.118 15:09:53 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x
00:08:15.118 ************************************
00:08:15.118 END TEST default_locks
00:08:15.118 ************************************
00:08:15.118 15:09:53 event.cpu_locks -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc
00:08:15.118 15:09:53 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:08:15.118 15:09:53 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable
00:08:15.118 15:09:53 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x
00:08:15.118 ************************************
00:08:15.118 START TEST default_locks_via_rpc
00:08:15.118 ************************************
00:08:15.118 15:09:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1129 -- # default_locks_via_rpc
00:08:15.118 15:09:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=1470237
00:08:15.118 15:09:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@63 -- # waitforlisten 1470237
00:08:15.118 15:09:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 1470237 ']'
00:08:15.118 15:09:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock
00:08:15.118 15:09:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100
00:08:15.118 15:09:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1
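The locks_exist check exercised in the default_locks run above is just lslocks filtered for the lock-file prefix; the "lslocks: write error" line is lslocks complaining that grep -q closed the pipe after the first match, not a test failure. A sketch of the check (lock-file naming taken from the trace):

    # Sketch of the lock check used above: SPDK holds flock()s on
    # /var/tmp/spdk_cpu_lock_NNN files, one per claimed core.
    locks_exist() {
        local pid=$1
        # grep -q exits at the first match and closes the pipe, which is why
        # lslocks may print "lslocks: write error" while the check still passes
        lslocks -p "$pid" | grep -q spdk_cpu_lock
    }

    locks_exist 1469936 && echo "core locks held"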
00:08:15.118 15:09:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:08:15.118 15:09:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable
00:08:15.118 15:09:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x
00:08:15.118 [2024-11-20 15:09:53.622795] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization...
00:08:15.118 [2024-11-20 15:09:53.622861] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1470237 ]
00:08:15.118 [2024-11-20 15:09:53.709329] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:15.118 [2024-11-20 15:09:53.733627] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:08:15.377 15:09:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 ))
00:08:15.377 15:09:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@868 -- # return 0
00:08:15.377 15:09:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks
00:08:15.377 15:09:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable
00:08:15.377 15:09:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x
00:08:15.377 15:09:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:08:15.377 15:09:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@67 -- # no_locks
00:08:15.377 15:09:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # lock_files=()
00:08:15.377 15:09:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # local lock_files
00:08:15.377 15:09:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@27 -- # (( 0 != 0 ))
00:08:15.377 15:09:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks
00:08:15.377 15:09:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable
00:08:15.377 15:09:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x
00:08:15.377 15:09:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:08:15.377 15:09:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@71 -- # locks_exist 1470237
00:08:15.377 15:09:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # lslocks -p 1470237
00:08:15.377 15:09:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock
00:08:15.943 15:09:54 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@73 -- # killprocess 1470237
00:08:15.943 15:09:54 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@954 -- # '[' -z 1470237 ']'
00:08:15.943 15:09:54 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@958 -- # kill -0 1470237
00:08:15.943 15:09:54 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@959 -- # uname
00:08:15.943 15:09:54 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']'
00:08:15.943 15:09:54 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1470237
00:08:15.943 15:09:54 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0
00:08:15.943 15:09:54 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']'
00:08:15.943 15:09:54 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1470237'
killing process with pid 1470237
15:09:54 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@973 -- # kill 1470237
15:09:54 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@978 -- # wait 1470237
00:08:16.201
00:08:16.201 real 0m1.241s
00:08:16.201 user 0m1.202s
00:08:16.201 sys 0m0.593s
00:08:16.201 15:09:54 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:08:16.201 15:09:54 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x
00:08:16.201 ************************************
00:08:16.201 END TEST default_locks_via_rpc
00:08:16.201 ************************************
00:08:16.201 15:09:54 event.cpu_locks -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask
00:08:16.201 15:09:54 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:08:16.201 15:09:54 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable
00:08:16.201 15:09:54 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x
00:08:16.458 ************************************
00:08:16.458 START TEST non_locking_app_on_locked_coremask
00:08:16.458 ************************************
00:08:16.458 15:09:54 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1129 -- # non_locking_app_on_locked_coremask
00:08:16.458 15:09:54 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=1470434
00:08:16.458 15:09:54 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1
00:08:16.458 15:09:54 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@81 -- # waitforlisten 1470434 /var/tmp/spdk.sock
00:08:16.458 15:09:54 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 1470434 ']'
00:08:16.459 15:09:54 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock
00:08:16.459 15:09:54 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100
00:08:16.459 15:09:54 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:08:16.459 15:09:54 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable
00:08:16.459 15:09:54 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x
00:08:16.459 [2024-11-20 15:09:54.937532] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization...
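The default_locks_via_rpc run above exercises the same claim/release cycle over RPC rather than process start-up flags: framework_disable_cpumask_locks drops the per-core lock files, no_locks asserts none remain, and framework_enable_cpumask_locks re-claims them. Roughly, using the rpc.py client and RPC method names shown in the log (the RPC shell variable and the trailing assertions are illustrative only):

    # The RPC round-trip exercised by default_locks_via_rpc (socket path from the log).
    RPC="/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk.sock"

    $RPC framework_disable_cpumask_locks                  # release per-core lock files
    ls /var/tmp/spdk_cpu_lock_* 2>/dev/null && exit 1     # expect none to remain
    $RPC framework_enable_cpumask_locks                   # take the locks again
    lslocks -p "$spdk_tgt_pid" | grep -q spdk_cpu_lock    # assumes spdk_tgt_pid is set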
00:08:16.459 [2024-11-20 15:09:54.937591] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1470434 ]
00:08:16.459 [2024-11-20 15:09:55.026191] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:16.459 [2024-11-20 15:09:55.050696] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:08:16.717 15:09:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 ))
00:08:16.717 15:09:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0
00:08:16.717 15:09:55 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=1470479
00:08:16.717 15:09:55 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@85 -- # waitforlisten 1470479 /var/tmp/spdk2.sock
00:08:16.717 15:09:55 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock
00:08:16.717 15:09:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 1470479 ']'
00:08:16.717 15:09:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock
00:08:16.717 15:09:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100
00:08:16.717 15:09:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...
00:08:16.717 15:09:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable
00:08:16.717 15:09:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x
00:08:16.975 [2024-11-20 15:09:55.288574] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization...
00:08:16.975 [2024-11-20 15:09:55.288641] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1470479 ]
00:08:16.975 [2024-11-20 15:09:55.402605] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated.
00:08:16.975 [2024-11-20 15:09:55.402631] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:16.975 [2024-11-20 15:09:55.445926] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:08:17.548 15:09:56 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 ))
00:08:17.548 15:09:56 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0
00:08:17.548 15:09:56 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@87 -- # locks_exist 1470434
00:08:17.548 15:09:56 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock
00:08:17.548 15:09:56 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 1470434
00:08:18.927 lslocks: write error
00:08:18.927 15:09:57 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@89 -- # killprocess 1470434
00:08:18.927 15:09:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 1470434 ']'
00:08:18.927 15:09:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 1470434
00:08:18.927 15:09:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname
00:08:18.927 15:09:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']'
00:08:18.927 15:09:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1470434
00:08:18.927 15:09:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0
00:08:18.927 15:09:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']'
00:08:18.927 15:09:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1470434'
killing process with pid 1470434
15:09:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 1470434
15:09:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 1470434
00:08:19.494 15:09:57 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@90 -- # killprocess 1470479
00:08:19.494 15:09:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 1470479 ']'
00:08:19.494 15:09:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 1470479
00:08:19.494 15:09:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname
00:08:19.494 15:09:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']'
00:08:19.494 15:09:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1470479
00:08:19.494 15:09:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0
00:08:19.494 15:09:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']'
00:08:19.494 15:09:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1470479'
killing process with pid 1470479
00:08:19.494 15:09:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 1470479
00:08:19.494 15:09:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 1470479
00:08:19.753
00:08:19.753 real 0m3.474s
00:08:19.753 user 0m3.639s
00:08:19.753 sys 0m1.391s
00:08:19.753 15:09:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable
00:08:19.753 15:09:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x
00:08:19.754 ************************************
00:08:19.754 END TEST non_locking_app_on_locked_coremask
00:08:19.754 ************************************
00:08:19.754 15:09:58 event.cpu_locks -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask
00:08:19.754 15:09:58 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:08:19.754 15:09:58 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable
00:08:19.754 15:09:58 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x
00:08:20.012 ************************************
00:08:20.012 START TEST locking_app_on_unlocked_coremask
00:08:20.012 ************************************
00:08:20.012 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1129 -- # locking_app_on_unlocked_coremask
00:08:20.013 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=1470869
00:08:20.013 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks
00:08:20.013 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@99 -- # waitforlisten 1470869 /var/tmp/spdk.sock
00:08:20.013 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # '[' -z 1470869 ']'
00:08:20.013 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock
00:08:20.013 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100
00:08:20.013 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:08:20.013 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable
00:08:20.013 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x
00:08:20.013 [2024-11-20 15:09:58.491127] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization...
00:08:20.013 [2024-11-20 15:09:58.491196] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1470869 ]
00:08:20.013 [2024-11-20 15:09:58.587686] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated.
00:08:20.013 [2024-11-20 15:09:58.587730] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:20.013 [2024-11-20 15:09:58.621566] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:08:20.272 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 ))
00:08:20.272 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@868 -- # return 0
00:08:20.272 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=1470968
00:08:20.272 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@103 -- # waitforlisten 1470968 /var/tmp/spdk2.sock
00:08:20.272 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock
00:08:20.272 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # '[' -z 1470968 ']'
00:08:20.272 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock
00:08:20.272 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100
00:08:20.272 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...
00:08:20.272 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable
00:08:20.272 15:09:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x
00:08:20.272 [2024-11-20 15:09:58.891874] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization...
00:08:20.272 [2024-11-20 15:09:58.891941] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1470968 ]
00:08:20.531 [2024-11-20 15:09:59.010308] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:20.531 [2024-11-20 15:09:59.057878] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:08:21.097 15:09:59 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 ))
00:08:21.097 15:09:59 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@868 -- # return 0
00:08:21.097 15:09:59 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@105 -- # locks_exist 1470968
00:08:21.097 15:09:59 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock
00:08:21.097 15:09:59 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 1470968
00:08:22.037 lslocks: write error
00:08:22.037 15:10:00 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@107 -- # killprocess 1470869
00:08:22.037 15:10:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' -z 1470869 ']'
00:08:22.037 15:10:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@958 -- # kill -0 1470869
00:08:22.037 15:10:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # uname
00:08:22.037 15:10:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']'
00:08:22.037 15:10:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1470869
00:08:22.037 15:10:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0
00:08:22.037 15:10:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']'
00:08:22.037 15:10:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1470869'
killing process with pid 1470869
15:10:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # kill 1470869
15:10:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@978 -- # wait 1470869
00:08:22.971 15:10:01 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@108 -- # killprocess 1470968
00:08:22.971 15:10:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' -z 1470968 ']'
00:08:22.971 15:10:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@958 -- # kill -0 1470968
00:08:22.971 15:10:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # uname
00:08:22.971 15:10:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']'
00:08:22.971 15:10:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1470968
00:08:22.971 15:10:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0
00:08:22.971 15:10:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']'
00:08:22.971 15:10:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1470968'
killing process with pid 1470968
15:10:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # kill 1470968
15:10:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@978 -- # wait 1470968
00:08:22.971
00:08:22.971 real 0m3.185s
00:08:22.971 user 0m3.302s
00:08:22.971 sys 0m1.160s
00:08:22.971 15:10:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable
00:08:22.971 15:10:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x
00:08:22.971 ************************************
00:08:22.971 END TEST locking_app_on_unlocked_coremask
00:08:22.971 ************************************
00:08:23.230 15:10:01 event.cpu_locks -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask
00:08:23.230 15:10:01 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:08:23.230 15:10:01 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable
00:08:23.230 15:10:01 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x
00:08:23.230 ************************************
00:08:23.230 START TEST locking_app_on_locked_coremask
00:08:23.230 ************************************
00:08:23.230 15:10:01 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1129 -- # locking_app_on_locked_coremask
00:08:23.230 15:10:01 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=1471423
00:08:23.230 15:10:01 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@116 -- # waitforlisten 1471423 /var/tmp/spdk.sock
00:08:23.230 15:10:01 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1
00:08:23.230 15:10:01 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 1471423 ']'
00:08:23.230 15:10:01 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock
00:08:23.230 15:10:01 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100
00:08:23.230 15:10:01 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:08:23.230 15:10:01 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable
00:08:23.230 15:10:01 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x
00:08:23.230 [2024-11-20 15:10:01.764939] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization...
00:08:23.230 [2024-11-20 15:10:01.765011] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1471423 ]
00:08:23.230 [2024-11-20 15:10:01.851403] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:23.230 [2024-11-20 15:10:01.877209] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:08:23.489 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 ))
00:08:23.489 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0
00:08:23.489 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=1471426
00:08:23.489 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@120 -- # NOT waitforlisten 1471426 /var/tmp/spdk2.sock
00:08:23.489 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock
00:08:23.489 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@652 -- # local es=0
00:08:23.489 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 1471426 /var/tmp/spdk2.sock
00:08:23.489 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@640 -- # local arg=waitforlisten
00:08:23.489 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in
00:08:23.489 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # type -t waitforlisten
00:08:23.489 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in
00:08:23.489 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@655 -- # waitforlisten 1471426 /var/tmp/spdk2.sock
00:08:23.489 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 1471426 ']'
00:08:23.489 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock
00:08:23.489 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100
00:08:23.489 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...
00:08:23.489 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable
00:08:23.489 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x
00:08:23.489 [2024-11-20 15:10:02.104518] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization...
00:08:23.489 [2024-11-20 15:10:02.104593] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1471426 ]
00:08:23.748 [2024-11-20 15:10:02.224711] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 1471423 has claimed it.
00:08:23.748 [2024-11-20 15:10:02.224753] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting.
00:08:24.314 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (1471426) - No such process
00:08:24.314 ERROR: process (pid: 1471426) is no longer running
00:08:24.314 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 ))
00:08:24.314 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 1
00:08:24.314 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@655 -- # es=1
00:08:24.314 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@663 -- # (( es > 128 ))
00:08:24.314 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@674 -- # [[ -n '' ]]
00:08:24.314 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@679 -- # (( !es == 0 ))
00:08:24.314 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@122 -- # locks_exist 1471423
00:08:24.314 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 1471423
00:08:24.314 15:10:02 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock
00:08:24.572 lslocks: write error
00:08:24.572 15:10:03 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@124 -- # killprocess 1471423
00:08:24.572 15:10:03 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 1471423 ']'
00:08:24.572 15:10:03 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 1471423
00:08:24.572 15:10:03 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname
00:08:24.572 15:10:03 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']'
00:08:24.572 15:10:03 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1471423
00:08:24.572 15:10:03 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0
00:08:24.572 15:10:03 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']'
00:08:24.572 15:10:03 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1471423'
killing process with pid 1471423
15:10:03 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 1471423
15:10:03 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 1471423
00:08:24.831
00:08:24.831 real 0m1.697s
00:08:24.831 user 0m1.765s
00:08:24.831 sys 0m0.660s
00:08:24.831 15:10:03 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable
00:08:24.831 15:10:03 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x
00:08:24.831 ************************************
00:08:24.831 END TEST locking_app_on_locked_coremask
00:08:24.831 ************************************
00:08:24.831 15:10:03 event.cpu_locks -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask
00:08:24.831 15:10:03 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:08:24.831 15:10:03 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable
00:08:24.831 15:10:03 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x
00:08:25.089 ************************************
00:08:25.089 START TEST locking_overlapped_coremask
00:08:25.089 ************************************
00:08:25.089 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1129 -- # locking_overlapped_coremask
00:08:25.089 15:10:03 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=1471636
00:08:25.089 15:10:03 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7
00:08:25.089 15:10:03 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@133 -- # waitforlisten 1471636 /var/tmp/spdk.sock
00:08:25.089 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # '[' -z 1471636 ']'
00:08:25.089 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock
00:08:25.089 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # local max_retries=100
00:08:25.089 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:08:25.089 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@844 -- # xtrace_disable
00:08:25.089 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x
00:08:25.089 [2024-11-20 15:10:03.543266] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization...
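The locked-coremask test above and the overlapped-coremask test below both assert failure through the NOT wrapper: the second spdk_tgt is expected to die with "Cannot create lock on core N, probably process X has claimed it", so the check is that waitforlisten does not succeed. A sketch of that wrapper pattern (second_pid is a hypothetical variable for illustration; the real helper also validates its argument first):

    # Sketch of the NOT wrapper pattern from autotest_common.sh: succeed only
    # when the wrapped command fails (here: a second spdk_tgt on a claimed core).
    NOT() {
        local es=0
        "$@" || es=$?
        # exit status 0 means the command unexpectedly succeeded -> NOT fails
        (( es != 0 ))
    }

    NOT waitforlisten "$second_pid" /var/tmp/spdk2.sock && echo "conflict detected as expected"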
00:08:25.090 [2024-11-20 15:10:03.543347] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1471636 ]
00:08:25.090 [2024-11-20 15:10:03.629856] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3
00:08:25.090 [2024-11-20 15:10:03.658818] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1
00:08:25.090 [2024-11-20 15:10:03.658906] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2
00:08:25.090 [2024-11-20 15:10:03.658907] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:08:25.348 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # (( i == 0 ))
00:08:25.348 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@868 -- # return 0
00:08:25.348 15:10:03 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=1471649
00:08:25.348 15:10:03 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock
00:08:25.348 15:10:03 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@137 -- # NOT waitforlisten 1471649 /var/tmp/spdk2.sock
00:08:25.348 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@652 -- # local es=0
00:08:25.348 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 1471649 /var/tmp/spdk2.sock
00:08:25.348 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@640 -- # local arg=waitforlisten
00:08:25.348 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in
00:08:25.348 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # type -t waitforlisten
00:08:25.348 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in
00:08:25.348 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@655 -- # waitforlisten 1471649 /var/tmp/spdk2.sock
00:08:25.348 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # '[' -z 1471649 ']'
00:08:25.348 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock
00:08:25.348 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # local max_retries=100
00:08:25.348 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...
00:08:25.348 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@844 -- # xtrace_disable
00:08:25.348 15:10:03 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x
00:08:25.348 [2024-11-20 15:10:03.889347] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization...
00:08:25.348 [2024-11-20 15:10:03.889416] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1471649 ] 00:08:25.349 [2024-11-20 15:10:04.007150] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1471636 has claimed it. 00:08:25.349 [2024-11-20 15:10:04.007195] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:08:25.915 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (1471649) - No such process 00:08:25.915 ERROR: process (pid: 1471649) is no longer running 00:08:25.915 15:10:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:08:25.915 15:10:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@868 -- # return 1 00:08:25.915 15:10:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@655 -- # es=1 00:08:25.915 15:10:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:08:25.915 15:10:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:08:25.915 15:10:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:08:25.915 15:10:04 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:08:25.915 15:10:04 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:08:25.915 15:10:04 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:08:25.915 15:10:04 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:08:25.915 15:10:04 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@141 -- # killprocess 1471636 00:08:25.915 15:10:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@954 -- # '[' -z 1471636 ']' 00:08:25.915 15:10:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@958 -- # kill -0 1471636 00:08:25.915 15:10:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@959 -- # uname 00:08:25.915 15:10:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:08:25.915 15:10:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1471636 00:08:26.174 15:10:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:08:26.174 15:10:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:08:26.174 15:10:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1471636' 00:08:26.174 killing process with pid 1471636 00:08:26.174 15:10:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@973 -- # kill 1471636 00:08:26.174 15:10:04 
event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@978 -- # wait 1471636 00:08:26.432 00:08:26.432 real 0m1.418s 00:08:26.432 user 0m3.930s 00:08:26.432 sys 0m0.435s 00:08:26.432 15:10:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:08:26.432 15:10:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:08:26.432 ************************************ 00:08:26.432 END TEST locking_overlapped_coremask 00:08:26.432 ************************************ 00:08:26.432 15:10:04 event.cpu_locks -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:08:26.432 15:10:04 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:08:26.432 15:10:04 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:08:26.432 15:10:04 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:08:26.432 ************************************ 00:08:26.432 START TEST locking_overlapped_coremask_via_rpc 00:08:26.432 ************************************ 00:08:26.432 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1129 -- # locking_overlapped_coremask_via_rpc 00:08:26.432 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=1471850 00:08:26.433 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@149 -- # waitforlisten 1471850 /var/tmp/spdk.sock 00:08:26.433 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:08:26.433 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 1471850 ']' 00:08:26.433 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:26.433 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:08:26.433 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:26.433 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:26.433 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:08:26.433 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:26.433 [2024-11-20 15:10:05.051114] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:08:26.433 [2024-11-20 15:10:05.051196] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1471850 ] 00:08:26.691 [2024-11-20 15:10:05.141258] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
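Between these tests, check_remaining_locks (traced a few entries back) verifies that only the three per-core lock files of the surviving target exist. A sketch of that comparison, using the same glob and expected set as the trace:

    locks=(/var/tmp/spdk_cpu_lock_*)                     # whatever lock files are present
    locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})   # cores 0-2 claimed by the target
    [[ "${locks[*]}" == "${locks_expected[*]}" ]] || echo "unexpected lock files: ${locks[*]}"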
00:08:26.691 [2024-11-20 15:10:05.141289] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:26.691 [2024-11-20 15:10:05.169394] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:08:26.691 [2024-11-20 15:10:05.169416] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:08:26.691 [2024-11-20 15:10:05.169423] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.950 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:08:26.950 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:08:26.950 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=1471864 00:08:26.950 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@153 -- # waitforlisten 1471864 /var/tmp/spdk2.sock 00:08:26.950 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:08:26.950 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 1471864 ']' 00:08:26.950 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:08:26.950 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:08:26.950 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:08:26.950 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:08:26.950 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:08:26.950 15:10:05 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:26.950 [2024-11-20 15:10:05.425188] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:08:26.950 [2024-11-20 15:10:05.425260] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1471864 ] 00:08:26.950 [2024-11-20 15:10:05.546649] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:08:26.950 [2024-11-20 15:10:05.546680] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:26.950 [2024-11-20 15:10:05.601037] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:08:26.950 [2024-11-20 15:10:05.601144] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:08:26.950 [2024-11-20 15:10:05.601147] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 4 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@652 -- # local es=0 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@654 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@640 -- # local arg=rpc_cmd 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # type -t rpc_cmd 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@655 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:27.886 [2024-11-20 15:10:06.322390] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1471850 has claimed it. 
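The claim_cpu_cores error above is plain bitwise overlap between the two core masks: 0x7 (cores 0-2) and 0x1c (cores 2-4) both contain core 2, so once the first target claims its cores via the RPC, the second target's claim must fail. A one-line check:

    printf 'overlap: 0x%x\n' $(( 0x7 & 0x1c ))   # prints 0x4, i.e. core 2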
00:08:27.886 request: 00:08:27.886 { 00:08:27.886 "method": "framework_enable_cpumask_locks", 00:08:27.886 "req_id": 1 00:08:27.886 } 00:08:27.886 Got JSON-RPC error response 00:08:27.886 response: 00:08:27.886 { 00:08:27.886 "code": -32603, 00:08:27.886 "message": "Failed to claim CPU core: 2" 00:08:27.886 } 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@655 -- # es=1 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:08:27.886 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:08:27.887 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:08:27.887 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@158 -- # waitforlisten 1471850 /var/tmp/spdk.sock 00:08:27.887 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 1471850 ']' 00:08:27.887 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:27.887 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:08:27.887 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:27.887 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:27.887 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:08:27.887 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:27.887 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:08:27.887 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:08:27.887 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@159 -- # waitforlisten 1471864 /var/tmp/spdk2.sock 00:08:27.887 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 1471864 ']' 00:08:27.887 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:08:27.887 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:08:27.887 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:08:27.887 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
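The same refusal can be reproduced by hand against the second target's socket; a sketch, assuming the rpc.py path used elsewhere in this workspace:

    RPC=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
    # With process 1471850 holding core 2, this should fail with the JSON-RPC
    # -32603 "Failed to claim CPU core: 2" response shown above.
    if ! "$RPC" -s /var/tmp/spdk2.sock framework_enable_cpumask_locks; then
        echo "lock claim refused, as expected"
    fi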
00:08:27.887 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:08:27.887 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:28.145 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:08:28.145 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:08:28.145 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:08:28.145 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:08:28.145 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:08:28.145 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:08:28.145 00:08:28.145 real 0m1.723s 00:08:28.145 user 0m0.804s 00:08:28.145 sys 0m0.170s 00:08:28.145 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:08:28.145 15:10:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:28.145 ************************************ 00:08:28.145 END TEST locking_overlapped_coremask_via_rpc 00:08:28.145 ************************************ 00:08:28.145 15:10:06 event.cpu_locks -- event/cpu_locks.sh@174 -- # cleanup 00:08:28.145 15:10:06 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 1471850 ]] 00:08:28.145 15:10:06 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 1471850 00:08:28.145 15:10:06 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 1471850 ']' 00:08:28.145 15:10:06 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 1471850 00:08:28.145 15:10:06 event.cpu_locks -- common/autotest_common.sh@959 -- # uname 00:08:28.145 15:10:06 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:08:28.145 15:10:06 event.cpu_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1471850 00:08:28.404 15:10:06 event.cpu_locks -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:08:28.404 15:10:06 event.cpu_locks -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:08:28.404 15:10:06 event.cpu_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1471850' 00:08:28.404 killing process with pid 1471850 00:08:28.404 15:10:06 event.cpu_locks -- common/autotest_common.sh@973 -- # kill 1471850 00:08:28.404 15:10:06 event.cpu_locks -- common/autotest_common.sh@978 -- # wait 1471850 00:08:28.663 15:10:07 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 1471864 ]] 00:08:28.663 15:10:07 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 1471864 00:08:28.663 15:10:07 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 1471864 ']' 00:08:28.663 15:10:07 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 1471864 00:08:28.663 15:10:07 event.cpu_locks -- common/autotest_common.sh@959 -- # uname 00:08:28.663 15:10:07 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' 
Linux = Linux ']' 00:08:28.663 15:10:07 event.cpu_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1471864 00:08:28.663 15:10:07 event.cpu_locks -- common/autotest_common.sh@960 -- # process_name=reactor_2 00:08:28.663 15:10:07 event.cpu_locks -- common/autotest_common.sh@964 -- # '[' reactor_2 = sudo ']' 00:08:28.663 15:10:07 event.cpu_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1471864' 00:08:28.663 killing process with pid 1471864 00:08:28.663 15:10:07 event.cpu_locks -- common/autotest_common.sh@973 -- # kill 1471864 00:08:28.663 15:10:07 event.cpu_locks -- common/autotest_common.sh@978 -- # wait 1471864 00:08:28.921 15:10:07 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:08:28.921 15:10:07 event.cpu_locks -- event/cpu_locks.sh@1 -- # cleanup 00:08:28.921 15:10:07 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 1471850 ]] 00:08:28.921 15:10:07 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 1471850 00:08:28.921 15:10:07 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 1471850 ']' 00:08:28.921 15:10:07 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 1471850 00:08:28.921 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 958: kill: (1471850) - No such process 00:08:28.921 15:10:07 event.cpu_locks -- common/autotest_common.sh@981 -- # echo 'Process with pid 1471850 is not found' 00:08:28.921 Process with pid 1471850 is not found 00:08:28.921 15:10:07 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 1471864 ]] 00:08:28.921 15:10:07 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 1471864 00:08:28.921 15:10:07 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 1471864 ']' 00:08:28.921 15:10:07 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 1471864 00:08:28.922 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 958: kill: (1471864) - No such process 00:08:28.922 15:10:07 event.cpu_locks -- common/autotest_common.sh@981 -- # echo 'Process with pid 1471864 is not found' 00:08:28.922 Process with pid 1471864 is not found 00:08:28.922 15:10:07 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:08:28.922 00:08:28.922 real 0m15.705s 00:08:28.922 user 0m26.199s 00:08:28.922 sys 0m6.250s 00:08:28.922 15:10:07 event.cpu_locks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:08:28.922 15:10:07 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:08:28.922 ************************************ 00:08:28.922 END TEST cpu_locks 00:08:28.922 ************************************ 00:08:29.180 00:08:29.180 real 0m40.036s 00:08:29.180 user 1m13.742s 00:08:29.180 sys 0m10.692s 00:08:29.180 15:10:07 event -- common/autotest_common.sh@1130 -- # xtrace_disable 00:08:29.180 15:10:07 event -- common/autotest_common.sh@10 -- # set +x 00:08:29.180 ************************************ 00:08:29.180 END TEST event 00:08:29.180 ************************************ 00:08:29.180 15:10:07 -- spdk/autotest.sh@169 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:08:29.180 15:10:07 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:08:29.180 15:10:07 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:08:29.180 15:10:07 -- common/autotest_common.sh@10 -- # set +x 00:08:29.180 ************************************ 00:08:29.180 START TEST thread 00:08:29.180 ************************************ 00:08:29.180 15:10:07 thread -- 
common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:08:29.180 * Looking for test storage... 00:08:29.180 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:08:29.180 15:10:07 thread -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:08:29.180 15:10:07 thread -- common/autotest_common.sh@1693 -- # lcov --version 00:08:29.180 15:10:07 thread -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:08:29.180 15:10:07 thread -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:08:29.180 15:10:07 thread -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:29.180 15:10:07 thread -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:29.180 15:10:07 thread -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:29.180 15:10:07 thread -- scripts/common.sh@336 -- # IFS=.-: 00:08:29.181 15:10:07 thread -- scripts/common.sh@336 -- # read -ra ver1 00:08:29.181 15:10:07 thread -- scripts/common.sh@337 -- # IFS=.-: 00:08:29.181 15:10:07 thread -- scripts/common.sh@337 -- # read -ra ver2 00:08:29.181 15:10:07 thread -- scripts/common.sh@338 -- # local 'op=<' 00:08:29.181 15:10:07 thread -- scripts/common.sh@340 -- # ver1_l=2 00:08:29.181 15:10:07 thread -- scripts/common.sh@341 -- # ver2_l=1 00:08:29.181 15:10:07 thread -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:29.181 15:10:07 thread -- scripts/common.sh@344 -- # case "$op" in 00:08:29.181 15:10:07 thread -- scripts/common.sh@345 -- # : 1 00:08:29.181 15:10:07 thread -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:29.181 15:10:07 thread -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:08:29.181 15:10:07 thread -- scripts/common.sh@365 -- # decimal 1 00:08:29.181 15:10:07 thread -- scripts/common.sh@353 -- # local d=1 00:08:29.181 15:10:07 thread -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:29.440 15:10:07 thread -- scripts/common.sh@355 -- # echo 1 00:08:29.440 15:10:07 thread -- scripts/common.sh@365 -- # ver1[v]=1 00:08:29.440 15:10:07 thread -- scripts/common.sh@366 -- # decimal 2 00:08:29.440 15:10:07 thread -- scripts/common.sh@353 -- # local d=2 00:08:29.440 15:10:07 thread -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:29.440 15:10:07 thread -- scripts/common.sh@355 -- # echo 2 00:08:29.440 15:10:07 thread -- scripts/common.sh@366 -- # ver2[v]=2 00:08:29.440 15:10:07 thread -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:29.440 15:10:07 thread -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:29.440 15:10:07 thread -- scripts/common.sh@368 -- # return 0 00:08:29.440 15:10:07 thread -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:29.440 15:10:07 thread -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:08:29.440 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:29.440 --rc genhtml_branch_coverage=1 00:08:29.440 --rc genhtml_function_coverage=1 00:08:29.440 --rc genhtml_legend=1 00:08:29.440 --rc geninfo_all_blocks=1 00:08:29.440 --rc geninfo_unexecuted_blocks=1 00:08:29.440 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:29.440 ' 00:08:29.440 15:10:07 thread -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:08:29.440 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:29.440 --rc genhtml_branch_coverage=1 00:08:29.440 --rc genhtml_function_coverage=1 00:08:29.440 --rc genhtml_legend=1 
00:08:29.440 --rc geninfo_all_blocks=1 00:08:29.440 --rc geninfo_unexecuted_blocks=1 00:08:29.440 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:29.440 ' 00:08:29.440 15:10:07 thread -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:08:29.440 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:29.440 --rc genhtml_branch_coverage=1 00:08:29.440 --rc genhtml_function_coverage=1 00:08:29.440 --rc genhtml_legend=1 00:08:29.440 --rc geninfo_all_blocks=1 00:08:29.440 --rc geninfo_unexecuted_blocks=1 00:08:29.440 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:29.440 ' 00:08:29.440 15:10:07 thread -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:08:29.440 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:29.440 --rc genhtml_branch_coverage=1 00:08:29.440 --rc genhtml_function_coverage=1 00:08:29.440 --rc genhtml_legend=1 00:08:29.440 --rc geninfo_all_blocks=1 00:08:29.440 --rc geninfo_unexecuted_blocks=1 00:08:29.440 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:29.440 ' 00:08:29.440 15:10:07 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:08:29.440 15:10:07 thread -- common/autotest_common.sh@1105 -- # '[' 8 -le 1 ']' 00:08:29.440 15:10:07 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:08:29.440 15:10:07 thread -- common/autotest_common.sh@10 -- # set +x 00:08:29.440 ************************************ 00:08:29.440 START TEST thread_poller_perf 00:08:29.440 ************************************ 00:08:29.440 15:10:07 thread.thread_poller_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:08:29.440 [2024-11-20 15:10:07.917687] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:08:29.440 [2024-11-20 15:10:07.917756] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1472313 ] 00:08:29.440 [2024-11-20 15:10:07.997575] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:29.440 [2024-11-20 15:10:08.021737] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.440 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:08:30.375 [2024-11-20T14:10:09.061Z] ====================================== 00:08:30.375 [2024-11-20T14:10:09.061Z] busy:2304784824 (cyc) 00:08:30.375 [2024-11-20T14:10:09.061Z] total_run_count: 817000 00:08:30.375 [2024-11-20T14:10:09.061Z] tsc_hz: 2300000000 (cyc) 00:08:30.375 [2024-11-20T14:10:09.061Z] ====================================== 00:08:30.375 [2024-11-20T14:10:09.061Z] poller_cost: 2821 (cyc), 1226 (nsec) 00:08:30.635 00:08:30.635 real 0m1.155s 00:08:30.635 user 0m1.062s 00:08:30.635 sys 0m0.088s 00:08:30.635 15:10:09 thread.thread_poller_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:08:30.635 15:10:09 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:08:30.635 ************************************ 00:08:30.635 END TEST thread_poller_perf 00:08:30.635 ************************************ 00:08:30.635 15:10:09 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:08:30.635 15:10:09 thread -- common/autotest_common.sh@1105 -- # '[' 8 -le 1 ']' 00:08:30.635 15:10:09 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:08:30.635 15:10:09 thread -- common/autotest_common.sh@10 -- # set +x 00:08:30.635 ************************************ 00:08:30.635 START TEST thread_poller_perf 00:08:30.635 ************************************ 00:08:30.635 15:10:09 thread.thread_poller_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:08:30.635 [2024-11-20 15:10:09.139056] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:08:30.635 [2024-11-20 15:10:09.139172] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1472510 ] 00:08:30.635 [2024-11-20 15:10:09.226646] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.635 [2024-11-20 15:10:09.252073] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.635 Running 1000 pollers for 1 seconds with 0 microseconds period. 
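The poller_cost line in the summary above is derived from the other fields: busy cycles divided by total_run_count, then converted to nanoseconds via tsc_hz. Re-deriving the first run's numbers (the formula is inferred from the printed fields, not lifted from poller_perf's source):

    busy=2304784824; runs=817000; tsc_hz=2300000000
    cyc=$(( busy / runs ))                    # 2821 cycles per poller invocation
    nsec=$(( cyc * 1000000000 / tsc_hz ))     # 1226 ns at the 2.3 GHz TSC above
    echo "poller_cost: ${cyc} (cyc), ${nsec} (nsec)"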
00:08:32.009 [2024-11-20T14:10:10.695Z] ====================================== 00:08:32.009 [2024-11-20T14:10:10.695Z] busy:2301240512 (cyc) 00:08:32.009 [2024-11-20T14:10:10.695Z] total_run_count: 11015000 00:08:32.009 [2024-11-20T14:10:10.695Z] tsc_hz: 2300000000 (cyc) 00:08:32.009 [2024-11-20T14:10:10.695Z] ====================================== 00:08:32.009 [2024-11-20T14:10:10.695Z] poller_cost: 208 (cyc), 90 (nsec) 00:08:32.009 00:08:32.009 real 0m1.170s 00:08:32.009 user 0m1.073s 00:08:32.009 sys 0m0.092s 00:08:32.009 15:10:10 thread.thread_poller_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:08:32.009 15:10:10 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:08:32.009 ************************************ 00:08:32.009 END TEST thread_poller_perf 00:08:32.009 ************************************ 00:08:32.009 15:10:10 thread -- thread/thread.sh@17 -- # [[ n != \y ]] 00:08:32.009 15:10:10 thread -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:08:32.009 15:10:10 thread -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:08:32.009 15:10:10 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:08:32.009 15:10:10 thread -- common/autotest_common.sh@10 -- # set +x 00:08:32.009 ************************************ 00:08:32.009 START TEST thread_spdk_lock 00:08:32.009 ************************************ 00:08:32.009 15:10:10 thread.thread_spdk_lock -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:08:32.009 [2024-11-20 15:10:10.382372] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:08:32.009 [2024-11-20 15:10:10.382454] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1472703 ] 00:08:32.009 [2024-11-20 15:10:10.470340] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:32.009 [2024-11-20 15:10:10.497770] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:08:32.009 [2024-11-20 15:10:10.497772] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.577 [2024-11-20 15:10:10.987356] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 980:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:08:32.577 [2024-11-20 15:10:10.987395] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3112:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:08:32.577 [2024-11-20 15:10:10.987405] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3067:sspin_stacks_print: *ERROR*: spinlock 0x1367a80 00:08:32.577 [2024-11-20 15:10:10.988066] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 875:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:08:32.577 [2024-11-20 15:10:10.988168] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1041:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:08:32.577 [2024-11-20 
15:10:10.988188] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 875:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0)
00:08:32.577 Starting test contend
00:08:32.577   Worker    Delay  Wait us  Hold us Total us
00:08:32.577        0        3   172362   186591   358953
00:08:32.577        1        5    93446   286411   379858
00:08:32.577 PASS test contend
00:08:32.577 Starting test hold_by_poller
00:08:32.577 PASS test hold_by_poller
00:08:32.577 Starting test hold_by_message
00:08:32.577 PASS test hold_by_message
00:08:32.577 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary:
00:08:32.577 100014 assertions passed
00:08:32.577 0 assertions failed
00:08:32.577
00:08:32.577 real 0m0.659s
00:08:32.577 user 0m1.058s
00:08:32.577 sys 0m0.088s
00:08:32.577 15:10:11 thread.thread_spdk_lock -- common/autotest_common.sh@1130 -- # xtrace_disable
00:08:32.577 15:10:11 thread.thread_spdk_lock -- common/autotest_common.sh@10 -- # set +x
00:08:32.577 ************************************
00:08:32.577 END TEST thread_spdk_lock
00:08:32.577 ************************************
00:08:32.577
00:08:32.577 real 0m3.375s
00:08:32.577 user 0m3.366s
00:08:32.577 sys 0m0.518s
00:08:32.577 15:10:11 thread -- common/autotest_common.sh@1130 -- # xtrace_disable
00:08:32.577 15:10:11 thread -- common/autotest_common.sh@10 -- # set +x
00:08:32.577 ************************************
00:08:32.577 END TEST thread
00:08:32.577 ************************************
00:08:32.577 15:10:11 -- spdk/autotest.sh@171 -- # [[ 0 -eq 1 ]]
00:08:32.577 15:10:11 -- spdk/autotest.sh@176 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh
00:08:32.577 15:10:11 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:08:32.577 15:10:11 -- common/autotest_common.sh@1111 -- # xtrace_disable
00:08:32.577 15:10:11 -- common/autotest_common.sh@10 -- # set +x
00:08:32.577 ************************************
00:08:32.577 START TEST app_cmdline
00:08:32.577 ************************************
00:08:32.577 15:10:11 app_cmdline -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh
00:08:32.577 * Looking for test storage...
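In the contend table above, Total us is simply Wait us plus Hold us for each worker:

    echo $(( 172362 + 186591 ))   # 358953, worker 0's Total us
    echo $(( 93446 + 286411 ))    # 379858, worker 1's Total us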
00:08:32.577 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:32.835 15:10:11 app_cmdline -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:08:32.835 15:10:11 app_cmdline -- common/autotest_common.sh@1693 -- # lcov --version 00:08:32.835 15:10:11 app_cmdline -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:08:32.835 15:10:11 app_cmdline -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@336 -- # IFS=.-: 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@336 -- # read -ra ver1 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@337 -- # IFS=.-: 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@337 -- # read -ra ver2 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@338 -- # local 'op=<' 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@340 -- # ver1_l=2 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@341 -- # ver2_l=1 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@344 -- # case "$op" in 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@345 -- # : 1 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@365 -- # decimal 1 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@353 -- # local d=1 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@355 -- # echo 1 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@365 -- # ver1[v]=1 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@366 -- # decimal 2 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@353 -- # local d=2 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@355 -- # echo 2 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@366 -- # ver2[v]=2 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:32.835 15:10:11 app_cmdline -- scripts/common.sh@368 -- # return 0 00:08:32.835 15:10:11 app_cmdline -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:32.835 15:10:11 app_cmdline -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:08:32.835 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:32.835 --rc genhtml_branch_coverage=1 00:08:32.835 --rc genhtml_function_coverage=1 00:08:32.835 --rc genhtml_legend=1 00:08:32.835 --rc geninfo_all_blocks=1 00:08:32.835 --rc geninfo_unexecuted_blocks=1 00:08:32.835 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:32.835 ' 00:08:32.835 15:10:11 app_cmdline -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:08:32.835 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:32.835 --rc genhtml_branch_coverage=1 00:08:32.835 --rc genhtml_function_coverage=1 00:08:32.835 --rc 
genhtml_legend=1 00:08:32.835 --rc geninfo_all_blocks=1 00:08:32.835 --rc geninfo_unexecuted_blocks=1 00:08:32.835 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:32.835 ' 00:08:32.835 15:10:11 app_cmdline -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:08:32.835 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:32.835 --rc genhtml_branch_coverage=1 00:08:32.835 --rc genhtml_function_coverage=1 00:08:32.835 --rc genhtml_legend=1 00:08:32.835 --rc geninfo_all_blocks=1 00:08:32.835 --rc geninfo_unexecuted_blocks=1 00:08:32.835 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:32.835 ' 00:08:32.835 15:10:11 app_cmdline -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:08:32.835 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:32.835 --rc genhtml_branch_coverage=1 00:08:32.835 --rc genhtml_function_coverage=1 00:08:32.835 --rc genhtml_legend=1 00:08:32.835 --rc geninfo_all_blocks=1 00:08:32.835 --rc geninfo_unexecuted_blocks=1 00:08:32.836 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:32.836 ' 00:08:32.836 15:10:11 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:08:32.836 15:10:11 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=1472943 00:08:32.836 15:10:11 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 1472943 00:08:32.836 15:10:11 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:08:32.836 15:10:11 app_cmdline -- common/autotest_common.sh@835 -- # '[' -z 1472943 ']' 00:08:32.836 15:10:11 app_cmdline -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:32.836 15:10:11 app_cmdline -- common/autotest_common.sh@840 -- # local max_retries=100 00:08:32.836 15:10:11 app_cmdline -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:32.836 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:32.836 15:10:11 app_cmdline -- common/autotest_common.sh@844 -- # xtrace_disable 00:08:32.836 15:10:11 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:32.836 [2024-11-20 15:10:11.379883] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
00:08:32.836 [2024-11-20 15:10:11.379973] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1472943 ] 00:08:32.836 [2024-11-20 15:10:11.462591] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:32.836 [2024-11-20 15:10:11.487100] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.093 15:10:11 app_cmdline -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:08:33.093 15:10:11 app_cmdline -- common/autotest_common.sh@868 -- # return 0 00:08:33.093 15:10:11 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:08:33.351 { 00:08:33.351 "version": "SPDK v25.01-pre git sha1 557f022f6", 00:08:33.351 "fields": { 00:08:33.351 "major": 25, 00:08:33.351 "minor": 1, 00:08:33.351 "patch": 0, 00:08:33.351 "suffix": "-pre", 00:08:33.351 "commit": "557f022f6" 00:08:33.351 } 00:08:33.351 } 00:08:33.351 15:10:11 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:08:33.351 15:10:11 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:08:33.351 15:10:11 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:08:33.351 15:10:11 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:08:33.351 15:10:11 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:08:33.351 15:10:11 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:08:33.351 15:10:11 app_cmdline -- app/cmdline.sh@26 -- # sort 00:08:33.351 15:10:11 app_cmdline -- common/autotest_common.sh@563 -- # xtrace_disable 00:08:33.351 15:10:11 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:33.351 15:10:11 app_cmdline -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:08:33.351 15:10:11 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:08:33.351 15:10:11 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:08:33.351 15:10:11 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:33.351 15:10:11 app_cmdline -- common/autotest_common.sh@652 -- # local es=0 00:08:33.352 15:10:11 app_cmdline -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:33.352 15:10:11 app_cmdline -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:08:33.352 15:10:11 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:08:33.352 15:10:11 app_cmdline -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:08:33.352 15:10:11 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:08:33.352 15:10:11 app_cmdline -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:08:33.352 15:10:11 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:08:33.352 15:10:11 app_cmdline -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:08:33.352 15:10:11 app_cmdline -- 
common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:08:33.352 15:10:11 app_cmdline -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:33.611 request: 00:08:33.611 { 00:08:33.611 "method": "env_dpdk_get_mem_stats", 00:08:33.611 "req_id": 1 00:08:33.611 } 00:08:33.611 Got JSON-RPC error response 00:08:33.611 response: 00:08:33.611 { 00:08:33.611 "code": -32601, 00:08:33.611 "message": "Method not found" 00:08:33.611 } 00:08:33.611 15:10:12 app_cmdline -- common/autotest_common.sh@655 -- # es=1 00:08:33.611 15:10:12 app_cmdline -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:08:33.611 15:10:12 app_cmdline -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:08:33.611 15:10:12 app_cmdline -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:08:33.611 15:10:12 app_cmdline -- app/cmdline.sh@1 -- # killprocess 1472943 00:08:33.611 15:10:12 app_cmdline -- common/autotest_common.sh@954 -- # '[' -z 1472943 ']' 00:08:33.611 15:10:12 app_cmdline -- common/autotest_common.sh@958 -- # kill -0 1472943 00:08:33.611 15:10:12 app_cmdline -- common/autotest_common.sh@959 -- # uname 00:08:33.611 15:10:12 app_cmdline -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:08:33.611 15:10:12 app_cmdline -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1472943 00:08:33.611 15:10:12 app_cmdline -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:08:33.611 15:10:12 app_cmdline -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:08:33.611 15:10:12 app_cmdline -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1472943' 00:08:33.611 killing process with pid 1472943 00:08:33.611 15:10:12 app_cmdline -- common/autotest_common.sh@973 -- # kill 1472943 00:08:33.611 15:10:12 app_cmdline -- common/autotest_common.sh@978 -- # wait 1472943 00:08:33.870 00:08:33.870 real 0m1.356s 00:08:33.870 user 0m1.508s 00:08:33.870 sys 0m0.515s 00:08:33.870 15:10:12 app_cmdline -- common/autotest_common.sh@1130 -- # xtrace_disable 00:08:33.870 15:10:12 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:33.870 ************************************ 00:08:33.870 END TEST app_cmdline 00:08:33.870 ************************************ 00:08:33.870 15:10:12 -- spdk/autotest.sh@177 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:08:34.128 15:10:12 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:08:34.128 15:10:12 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:08:34.128 15:10:12 -- common/autotest_common.sh@10 -- # set +x 00:08:34.129 ************************************ 00:08:34.129 START TEST version 00:08:34.129 ************************************ 00:08:34.129 15:10:12 version -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:08:34.129 * Looking for test storage... 
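Because this spdk_tgt was started with --rpcs-allowed spdk_get_version,rpc_get_methods, any method outside that list gets the -32601 "Method not found" response seen above. A sketch of the two cases against the default /var/tmp/spdk.sock:

    RPC=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
    "$RPC" spdk_get_version          # on the allowlist: returns the version JSON
    "$RPC" env_dpdk_get_mem_stats    # not allowed: JSON-RPC -32601 "Method not found"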
00:08:34.129 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:34.129 15:10:12 version -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:08:34.129 15:10:12 version -- common/autotest_common.sh@1693 -- # lcov --version 00:08:34.129 15:10:12 version -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:08:34.129 15:10:12 version -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:08:34.129 15:10:12 version -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:34.129 15:10:12 version -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:34.129 15:10:12 version -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:34.129 15:10:12 version -- scripts/common.sh@336 -- # IFS=.-: 00:08:34.129 15:10:12 version -- scripts/common.sh@336 -- # read -ra ver1 00:08:34.129 15:10:12 version -- scripts/common.sh@337 -- # IFS=.-: 00:08:34.129 15:10:12 version -- scripts/common.sh@337 -- # read -ra ver2 00:08:34.129 15:10:12 version -- scripts/common.sh@338 -- # local 'op=<' 00:08:34.129 15:10:12 version -- scripts/common.sh@340 -- # ver1_l=2 00:08:34.129 15:10:12 version -- scripts/common.sh@341 -- # ver2_l=1 00:08:34.129 15:10:12 version -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:34.129 15:10:12 version -- scripts/common.sh@344 -- # case "$op" in 00:08:34.129 15:10:12 version -- scripts/common.sh@345 -- # : 1 00:08:34.129 15:10:12 version -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:34.129 15:10:12 version -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:08:34.129 15:10:12 version -- scripts/common.sh@365 -- # decimal 1 00:08:34.129 15:10:12 version -- scripts/common.sh@353 -- # local d=1 00:08:34.129 15:10:12 version -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:34.129 15:10:12 version -- scripts/common.sh@355 -- # echo 1 00:08:34.129 15:10:12 version -- scripts/common.sh@365 -- # ver1[v]=1 00:08:34.129 15:10:12 version -- scripts/common.sh@366 -- # decimal 2 00:08:34.129 15:10:12 version -- scripts/common.sh@353 -- # local d=2 00:08:34.129 15:10:12 version -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:34.129 15:10:12 version -- scripts/common.sh@355 -- # echo 2 00:08:34.129 15:10:12 version -- scripts/common.sh@366 -- # ver2[v]=2 00:08:34.129 15:10:12 version -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:34.129 15:10:12 version -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:34.129 15:10:12 version -- scripts/common.sh@368 -- # return 0 00:08:34.129 15:10:12 version -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:34.129 15:10:12 version -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:08:34.129 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.129 --rc genhtml_branch_coverage=1 00:08:34.129 --rc genhtml_function_coverage=1 00:08:34.129 --rc genhtml_legend=1 00:08:34.129 --rc geninfo_all_blocks=1 00:08:34.129 --rc geninfo_unexecuted_blocks=1 00:08:34.129 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:34.129 ' 00:08:34.129 15:10:12 version -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:08:34.129 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.129 --rc genhtml_branch_coverage=1 00:08:34.129 --rc genhtml_function_coverage=1 00:08:34.129 --rc genhtml_legend=1 00:08:34.129 --rc geninfo_all_blocks=1 00:08:34.129 --rc geninfo_unexecuted_blocks=1 00:08:34.129 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:34.129 ' 00:08:34.129 15:10:12 version -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:08:34.129 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.129 --rc genhtml_branch_coverage=1 00:08:34.129 --rc genhtml_function_coverage=1 00:08:34.129 --rc genhtml_legend=1 00:08:34.129 --rc geninfo_all_blocks=1 00:08:34.129 --rc geninfo_unexecuted_blocks=1 00:08:34.129 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:34.129 ' 00:08:34.129 15:10:12 version -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:08:34.129 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.129 --rc genhtml_branch_coverage=1 00:08:34.129 --rc genhtml_function_coverage=1 00:08:34.129 --rc genhtml_legend=1 00:08:34.129 --rc geninfo_all_blocks=1 00:08:34.129 --rc geninfo_unexecuted_blocks=1 00:08:34.129 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:34.129 ' 00:08:34.129 15:10:12 version -- app/version.sh@17 -- # get_header_version major 00:08:34.129 15:10:12 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:08:34.129 15:10:12 version -- app/version.sh@14 -- # cut -f2 00:08:34.129 15:10:12 version -- app/version.sh@14 -- # tr -d '"' 00:08:34.129 15:10:12 version -- app/version.sh@17 -- # major=25 00:08:34.129 15:10:12 version -- app/version.sh@18 -- # get_header_version minor 00:08:34.129 15:10:12 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:08:34.129 15:10:12 version -- app/version.sh@14 -- # cut -f2 00:08:34.129 15:10:12 version -- app/version.sh@14 -- # tr -d '"' 00:08:34.129 15:10:12 version -- app/version.sh@18 -- # minor=1 00:08:34.129 15:10:12 version -- app/version.sh@19 -- # get_header_version patch 00:08:34.129 15:10:12 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:08:34.129 15:10:12 version -- app/version.sh@14 -- # tr -d '"' 00:08:34.129 15:10:12 version -- app/version.sh@14 -- # cut -f2 00:08:34.129 15:10:12 version -- app/version.sh@19 -- # patch=0 00:08:34.129 15:10:12 version -- app/version.sh@20 -- # get_header_version suffix 00:08:34.388 15:10:12 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:08:34.388 15:10:12 version -- app/version.sh@14 -- # cut -f2 00:08:34.388 15:10:12 version -- app/version.sh@14 -- # tr -d '"' 00:08:34.388 15:10:12 version -- app/version.sh@20 -- # suffix=-pre 00:08:34.388 15:10:12 version -- app/version.sh@22 -- # version=25.1 00:08:34.388 15:10:12 version -- app/version.sh@25 -- # (( patch != 0 )) 00:08:34.388 15:10:12 version -- app/version.sh@28 -- # version=25.1rc0 00:08:34.388 15:10:12 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:34.388 15:10:12 version -- app/version.sh@30 -- # 
python3 -c 'import spdk; print(spdk.__version__)' 00:08:34.388 15:10:12 version -- app/version.sh@30 -- # py_version=25.1rc0 00:08:34.388 15:10:12 version -- app/version.sh@31 -- # [[ 25.1rc0 == \2\5\.\1\r\c\0 ]] 00:08:34.388 00:08:34.388 real 0m0.264s 00:08:34.388 user 0m0.158s 00:08:34.388 sys 0m0.156s 00:08:34.388 15:10:12 version -- common/autotest_common.sh@1130 -- # xtrace_disable 00:08:34.388 15:10:12 version -- common/autotest_common.sh@10 -- # set +x 00:08:34.388 ************************************ 00:08:34.388 END TEST version 00:08:34.388 ************************************ 00:08:34.388 15:10:12 -- spdk/autotest.sh@179 -- # '[' 0 -eq 1 ']' 00:08:34.388 15:10:12 -- spdk/autotest.sh@188 -- # [[ 0 -eq 1 ]] 00:08:34.388 15:10:12 -- spdk/autotest.sh@194 -- # uname -s 00:08:34.388 15:10:12 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:08:34.388 15:10:12 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:08:34.388 15:10:12 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:08:34.388 15:10:12 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 00:08:34.388 15:10:12 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:08:34.388 15:10:12 -- spdk/autotest.sh@260 -- # timing_exit lib 00:08:34.388 15:10:12 -- common/autotest_common.sh@732 -- # xtrace_disable 00:08:34.388 15:10:12 -- common/autotest_common.sh@10 -- # set +x 00:08:34.388 15:10:12 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:08:34.388 15:10:12 -- spdk/autotest.sh@267 -- # '[' 0 -eq 1 ']' 00:08:34.388 15:10:12 -- spdk/autotest.sh@276 -- # '[' 0 -eq 1 ']' 00:08:34.388 15:10:12 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:08:34.388 15:10:12 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:08:34.388 15:10:12 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:08:34.388 15:10:12 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:08:34.388 15:10:12 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:08:34.388 15:10:12 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:08:34.388 15:10:12 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:08:34.388 15:10:12 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:08:34.388 15:10:12 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:08:34.388 15:10:12 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:08:34.388 15:10:12 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:08:34.388 15:10:12 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:08:34.388 15:10:12 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:08:34.388 15:10:12 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:08:34.388 15:10:12 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:08:34.388 15:10:12 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:08:34.388 15:10:12 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:08:34.388 15:10:12 -- common/autotest_common.sh@10 -- # set +x 00:08:34.388 ************************************ 00:08:34.388 START TEST llvm_fuzz 00:08:34.388 ************************************ 00:08:34.388 15:10:12 llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:08:34.648 * Looking for test storage... 
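The get_header_version calls in the version test above all share one grep/cut/tr pipeline over include/spdk/version.h; a condensed sketch of how 25.1rc0 is assembled (the -pre to rc0 mapping is inferred from the suffix and final version printed in the trace):

    hdr=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h
    major=$(grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' "$hdr" | cut -f2 | tr -d '"')
    minor=$(grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' "$hdr" | cut -f2 | tr -d '"')
    patch=$(grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' "$hdr" | cut -f2 | tr -d '"')
    suffix=$(grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' "$hdr" | cut -f2 | tr -d '"')
    version=${major}.${minor}
    (( patch != 0 )) && version+=".${patch}"
    [[ $suffix == -pre ]] && version+=rc0    # 25.1 -> 25.1rc0, matching py_version above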
00:08:34.648 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:08:34.648 15:10:13 llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:08:34.648 15:10:13 llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:08:34.648 15:10:13 llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:08:34.648 15:10:13 llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:34.648 15:10:13 llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:34.649 15:10:13 llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:34.649 15:10:13 llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:34.649 15:10:13 llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:34.649 15:10:13 llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:34.649 15:10:13 llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:08:34.649 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.649 --rc genhtml_branch_coverage=1 00:08:34.649 --rc genhtml_function_coverage=1 00:08:34.649 --rc genhtml_legend=1 00:08:34.649 --rc geninfo_all_blocks=1 00:08:34.649 --rc geninfo_unexecuted_blocks=1 00:08:34.649 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:34.649 ' 00:08:34.649 15:10:13 llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:08:34.649 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.649 --rc genhtml_branch_coverage=1 00:08:34.649 --rc genhtml_function_coverage=1 00:08:34.649 --rc genhtml_legend=1 00:08:34.649 --rc geninfo_all_blocks=1 00:08:34.649 --rc 
geninfo_unexecuted_blocks=1 00:08:34.649 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:34.649 ' 00:08:34.649 15:10:13 llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:08:34.649 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.649 --rc genhtml_branch_coverage=1 00:08:34.649 --rc genhtml_function_coverage=1 00:08:34.649 --rc genhtml_legend=1 00:08:34.649 --rc geninfo_all_blocks=1 00:08:34.649 --rc geninfo_unexecuted_blocks=1 00:08:34.649 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:34.649 ' 00:08:34.649 15:10:13 llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:08:34.649 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.649 --rc genhtml_branch_coverage=1 00:08:34.649 --rc genhtml_function_coverage=1 00:08:34.649 --rc genhtml_legend=1 00:08:34.649 --rc geninfo_all_blocks=1 00:08:34.649 --rc geninfo_unexecuted_blocks=1 00:08:34.649 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:34.649 ' 00:08:34.649 15:10:13 llvm_fuzz -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:08:34.649 15:10:13 llvm_fuzz -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:08:34.649 15:10:13 llvm_fuzz -- common/autotest_common.sh@550 -- # fuzzers=() 00:08:34.649 15:10:13 llvm_fuzz -- common/autotest_common.sh@550 -- # local fuzzers 00:08:34.649 15:10:13 llvm_fuzz -- common/autotest_common.sh@552 -- # [[ -n '' ]] 00:08:34.649 15:10:13 llvm_fuzz -- common/autotest_common.sh@555 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:08:34.649 15:10:13 llvm_fuzz -- common/autotest_common.sh@556 -- # fuzzers=("${fuzzers[@]##*/}") 00:08:34.649 15:10:13 llvm_fuzz -- common/autotest_common.sh@559 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:08:34.649 15:10:13 llvm_fuzz -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:08:34.649 15:10:13 llvm_fuzz -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:08:34.649 15:10:13 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:34.649 15:10:13 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:34.649 15:10:13 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:34.649 15:10:13 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:34.649 15:10:13 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:34.649 15:10:13 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:34.649 15:10:13 llvm_fuzz -- fuzz/llvm.sh@19 -- # run_test nvmf_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:08:34.649 15:10:13 llvm_fuzz -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:08:34.649 15:10:13 llvm_fuzz -- common/autotest_common.sh@1111 -- # xtrace_disable 00:08:34.649 15:10:13 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:34.649 ************************************ 00:08:34.649 START TEST nvmf_llvm_fuzz 00:08:34.649 ************************************ 00:08:34.649 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:08:34.649 * Looking for test storage... 
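
Before assembling coverage options, the harness gates on the installed lcov being older than 2.x: the traced `lt 1.15 2` splits each version string on '.', '-' and ':' and compares the fields numerically, padding the shorter list. A simplified sketch of that comparison, assuming purely numeric components (the traced cmp_versions in scripts/common.sh also handles other operators such as '<=' and '>'):

    #!/usr/bin/env bash
    # Sketch: component-wise strict 'less than' over dotted version strings.
    lt() {
        local IFS=.-:
        local -a ver1 ver2
        read -ra ver1 <<< "$1"
        read -ra ver2 <<< "$2"
        local v n=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
        for (( v = 0; v < n; v++ )); do
            # missing trailing components compare as 0, so 1.15 vs 2 works
            (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && return 0
            (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && return 1
        done
        return 1   # equal is not strictly less-than
    }
    lt 1.15 2 && echo 'lcov 1.x detected: use the --rc lcov_*_coverage=1 option set'
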
00:08:34.911 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:08:34.911 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.911 --rc genhtml_branch_coverage=1 00:08:34.911 --rc genhtml_function_coverage=1 00:08:34.911 --rc genhtml_legend=1 00:08:34.911 --rc geninfo_all_blocks=1 00:08:34.911 --rc geninfo_unexecuted_blocks=1 00:08:34.911 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:34.911 ' 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:08:34.911 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.911 --rc genhtml_branch_coverage=1 00:08:34.911 --rc genhtml_function_coverage=1 00:08:34.911 --rc genhtml_legend=1 00:08:34.911 --rc geninfo_all_blocks=1 00:08:34.911 --rc geninfo_unexecuted_blocks=1 00:08:34.911 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:34.911 ' 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:08:34.911 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.911 --rc genhtml_branch_coverage=1 00:08:34.911 --rc genhtml_function_coverage=1 00:08:34.911 --rc genhtml_legend=1 00:08:34.911 --rc geninfo_all_blocks=1 00:08:34.911 --rc geninfo_unexecuted_blocks=1 00:08:34.911 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:34.911 ' 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:08:34.911 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.911 --rc genhtml_branch_coverage=1 00:08:34.911 --rc genhtml_function_coverage=1 00:08:34.911 --rc genhtml_legend=1 00:08:34.911 --rc geninfo_all_blocks=1 00:08:34.911 --rc geninfo_unexecuted_blocks=1 00:08:34.911 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:34.911 ' 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@60 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_MAX_NUMA_NODES=1 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_PGO_CAPTURE=n 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_LTO=n 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_ISCSI_INITIATOR=y 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_CET=n 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@24 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_OCF_PATH= 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_RDMA_SET_TOS=y 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_AIO_FSDEV=y 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_UBLK=y 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_ISAL_CRYPTO=y 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OPENSSL_PATH= 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_OCF=n 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_FUSE=n 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_VTUNE_DIR= 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FUZZER=y 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_FSDEV=y 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_CRYPTO=n 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_PGO_USE=n 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_VHOST=y 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DAOS=n 00:08:34.911 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_DAOS_DIR= 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_UNIT_TESTS=n 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_VIRTIO=y 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_DPDK_UADK=n 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_COVERAGE=y 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_RDMA=y 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_HAVE_LZ4=n 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_URING_PATH= 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_XNVME=n 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_VFIO_USER=y 00:08:34.912 15:10:13 
llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_ARCH=native 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_HAVE_EVP_MAC=y 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_URING_ZNS=n 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_WERROR=y 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_HAVE_LIBBSD=n 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_UBSAN=y 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_IPSEC_MB_DIR= 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_GOLANG=n 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_ISAL=y 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_IDXD_KERNEL=y 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_RDMA_PROV=verbs 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_APPS=y 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_SHARED=n 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_HAVE_KEYUTILS=y 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_FC_PATH= 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_FC=n 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_AVAHI=n 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_FIO_PLUGIN=y 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_RAID5F=n 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_EXAMPLES=y 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_TESTS=y 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_CRYPTO_MLX5=n 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_MAX_LCORES=128 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_IPSEC_MB=n 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_PGO_DIR= 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DEBUG=y 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_CROSS_PREFIX= 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_COPY_FILE_RANGE=y 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@90 -- # CONFIG_URING=n 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:34.912 15:10:13 
llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:34.912 #define SPDK_CONFIG_H 00:08:34.912 #define SPDK_CONFIG_AIO_FSDEV 1 00:08:34.912 #define SPDK_CONFIG_APPS 1 00:08:34.912 #define SPDK_CONFIG_ARCH native 00:08:34.912 #undef SPDK_CONFIG_ASAN 00:08:34.912 #undef SPDK_CONFIG_AVAHI 00:08:34.912 #undef SPDK_CONFIG_CET 00:08:34.912 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:08:34.912 #define SPDK_CONFIG_COVERAGE 1 00:08:34.912 #define SPDK_CONFIG_CROSS_PREFIX 00:08:34.912 #undef SPDK_CONFIG_CRYPTO 00:08:34.912 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:34.912 #undef SPDK_CONFIG_CUSTOMOCF 00:08:34.912 #undef SPDK_CONFIG_DAOS 00:08:34.912 #define SPDK_CONFIG_DAOS_DIR 00:08:34.912 #define SPDK_CONFIG_DEBUG 1 00:08:34.912 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:34.912 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:34.912 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:34.912 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:34.912 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:34.912 #undef SPDK_CONFIG_DPDK_UADK 00:08:34.912 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:34.912 #define SPDK_CONFIG_EXAMPLES 1 00:08:34.912 #undef SPDK_CONFIG_FC 00:08:34.912 #define SPDK_CONFIG_FC_PATH 00:08:34.912 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:34.912 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:34.912 #define SPDK_CONFIG_FSDEV 1 00:08:34.912 #undef 
SPDK_CONFIG_FUSE 00:08:34.912 #define SPDK_CONFIG_FUZZER 1 00:08:34.912 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:34.912 #undef SPDK_CONFIG_GOLANG 00:08:34.912 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:34.912 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:08:34.912 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:34.912 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:08:34.912 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:34.912 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:34.912 #undef SPDK_CONFIG_HAVE_LZ4 00:08:34.912 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:08:34.912 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:08:34.912 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:34.912 #define SPDK_CONFIG_IDXD 1 00:08:34.912 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:34.912 #undef SPDK_CONFIG_IPSEC_MB 00:08:34.912 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:34.912 #define SPDK_CONFIG_ISAL 1 00:08:34.912 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:34.912 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:34.912 #define SPDK_CONFIG_LIBDIR 00:08:34.912 #undef SPDK_CONFIG_LTO 00:08:34.912 #define SPDK_CONFIG_MAX_LCORES 128 00:08:34.912 #define SPDK_CONFIG_MAX_NUMA_NODES 1 00:08:34.912 #define SPDK_CONFIG_NVME_CUSE 1 00:08:34.912 #undef SPDK_CONFIG_OCF 00:08:34.912 #define SPDK_CONFIG_OCF_PATH 00:08:34.912 #define SPDK_CONFIG_OPENSSL_PATH 00:08:34.912 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:34.912 #define SPDK_CONFIG_PGO_DIR 00:08:34.912 #undef SPDK_CONFIG_PGO_USE 00:08:34.912 #define SPDK_CONFIG_PREFIX /usr/local 00:08:34.912 #undef SPDK_CONFIG_RAID5F 00:08:34.912 #undef SPDK_CONFIG_RBD 00:08:34.912 #define SPDK_CONFIG_RDMA 1 00:08:34.912 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:34.912 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:34.912 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:34.912 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:34.912 #undef SPDK_CONFIG_SHARED 00:08:34.912 #undef SPDK_CONFIG_SMA 00:08:34.912 #define SPDK_CONFIG_TESTS 1 00:08:34.912 #undef SPDK_CONFIG_TSAN 00:08:34.912 #define SPDK_CONFIG_UBLK 1 00:08:34.912 #define SPDK_CONFIG_UBSAN 1 00:08:34.912 #undef SPDK_CONFIG_UNIT_TESTS 00:08:34.912 #undef SPDK_CONFIG_URING 00:08:34.912 #define SPDK_CONFIG_URING_PATH 00:08:34.912 #undef SPDK_CONFIG_URING_ZNS 00:08:34.912 #undef SPDK_CONFIG_USDT 00:08:34.912 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:34.912 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:34.912 #define SPDK_CONFIG_VFIO_USER 1 00:08:34.912 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:34.912 #define SPDK_CONFIG_VHOST 1 00:08:34.912 #define SPDK_CONFIG_VIRTIO 1 00:08:34.912 #undef SPDK_CONFIG_VTUNE 00:08:34.912 #define SPDK_CONFIG_VTUNE_DIR 00:08:34.912 #define SPDK_CONFIG_WERROR 1 00:08:34.912 #define SPDK_CONFIG_WPDK_DIR 00:08:34.912 #undef SPDK_CONFIG_XNVME 00:08:34.912 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:34.912 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- 
scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@67 -- # 
PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # uname -s 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:08:34.913 15:10:13 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:08:34.913 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:08:34.913 
15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@140 -- # : v23.11 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@165 -- # export 
SPDK_TEST_ACCEL_DSA 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@177 -- # : 0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@178 -- # export SPDK_TEST_NVME_INTERRUPT 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@183 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@183 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@184 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@184 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@187 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@187 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@191 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@191 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@195 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@195 -- # PYTHONDONTWRITEBYTECODE=1 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@199 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@199 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@200 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@200 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@204 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@205 -- # rm -rf /var/tmp/asan_suppression_file 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@206 -- # cat 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@242 -- # echo leak:libfuse3.so 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@248 -- # '[' -z /var/spdk/dependencies ']' 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@251 -- # export DEPENDENCY_DIR 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@255 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@255 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@256 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@256 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@259 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@259 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:34.914 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@262 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@262 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV_MAIN=0 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV_LLVM=1 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@269 -- # _LCOV= 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ '' == *clang* ]] 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ 1 -eq 1 ]] 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # _LCOV=1 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@272 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@273 -- # _lcov_opt[_LCOV_MAIN]= 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@275 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@278 -- # '[' 0 -eq 0 ']' 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@279 -- # export valgrind= 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@279 -- # valgrind= 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # uname -s 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # '[' Linux = Linux ']' 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@286 -- # HUGEMEM=4096 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # export CLEAR_HUGE=yes 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # CLEAR_HUGE=yes 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@289 -- # MAKE=make 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@290 -- # MAKEFLAGS=-j72 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # export HUGEMEM=4096 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # HUGEMEM=4096 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@308 -- # NO_HUGE=() 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@309 -- # TEST_MODE= 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@331 -- # [[ -z 1473295 ]] 00:08:34.915 
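
The long run of paired ': 0' (or ': 1') and 'export SPDK_TEST_*' lines above is consistent with the xtrace signature of the default-then-export idiom: each autotest knob keeps whatever value the CI job injected and only falls back to a default when unset, since xtrace prints the expansion ':=' already performed. A sketch of that pattern under this assumption (flag names taken from the trace; the defaults are whatever autotest_common.sh encodes for this job):

    #!/usr/bin/env bash
    # ':' evaluates its arguments and discards them, so ${VAR:=x} assigns only if unset.
    : "${SPDK_TEST_FUZZER:=1}"              # this job fuzzes, so the trace shows ': 1'
    export SPDK_TEST_FUZZER
    : "${SPDK_TEST_NVME:=0}"                # most other suites default off here
    export SPDK_TEST_NVME
    : "${SPDK_TEST_NVMF_TRANSPORT:=rdma}"   # non-boolean knobs use the same idiom
    export SPDK_TEST_NVMF_TRANSPORT
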
15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@331 -- # kill -0 1473295 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1678 -- # set_test_storage 2147483648 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@341 -- # [[ -v testdir ]] 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@343 -- # local requested_size=2147483648 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@344 -- # local mount target_dir 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@346 -- # local -A mounts fss sizes avails uses 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@347 -- # local source fs size avail mount use 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@349 -- # local storage_fallback storage_candidates 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@351 -- # mktemp -udt spdk.XXXXXX 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@351 -- # storage_fallback=/tmp/spdk.O9M4a6 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@356 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@358 -- # [[ -n '' ]] 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@363 -- # [[ -n '' ]] 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@368 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.O9M4a6/tests/nvmf /tmp/spdk.O9M4a6 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # requested_size=2214592512 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@340 -- # df -T 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@340 -- # grep -v Filesystem 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_devtmpfs 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=devtmpfs 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=67108864 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=67108864 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=0 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/pmem0 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=ext2 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=4096 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=5284429824 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5284425728 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- 
common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_root 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=overlay 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=84510343168 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=94500274176 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=9989931008 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=47245373440 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=47250137088 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=4763648 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=18893950976 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=18900058112 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=6107136 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=47249588224 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=47250137088 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=548864 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=9450012672 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=9450024960 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=12288 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@379 -- # printf '* Looking for test storage...\n' 00:08:34.915 * Looking for test storage... 
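The set_test_storage trace above walks the output of `df -T` into five associative arrays (mounts, fss, sizes, avails, uses, keyed by mountpoint), then checks each entry in storage_candidates for enough free space. A minimal bash sketch of that selection logic, reconstructed from the traced commands: the pick_test_storage wrapper name is hypothetical, the 64 MiB headroom constant is inferred from the requested_size values in the trace (2147483648 grows to 2214592512), and byte-sized df blocks (-B1) are assumed because the traced run reports sizes in bytes.

    pick_test_storage() {
        local requested_size=$1               # bytes, e.g. 2147483648 (2 GiB)
        local target_dir mount target_space new_size
        local -A mounts fss sizes avails uses

        # The trace shows the request padded by 64 MiB before any checks.
        requested_size=$((requested_size + 64 * 1024 * 1024))

        # Parse df into per-mountpoint tables, skipping the header row.
        # Field order matches the traced read: source fs size use avail _ mount.
        while read -r source fs size use avail _ mount; do
            mounts["$mount"]=$source; fss["$mount"]=$fs
            sizes["$mount"]=$size;    uses["$mount"]=$use
            avails["$mount"]=$avail
        done < <(df -T -B1 | grep -v Filesystem)

        for target_dir in "${storage_candidates[@]}"; do
            # Map the candidate directory to its mountpoint, then look up space.
            mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
            target_space=${avails[$mount]}
            (( target_space == 0 || target_space < requested_size )) && continue
            # Skip candidates that would push the filesystem past ~95% full.
            new_size=$((requested_size + uses[$mount]))
            (( new_size * 100 / sizes[$mount] > 95 )) && continue
            export SPDK_TEST_STORAGE=$target_dir
            printf '* Found test storage at %s\n' "$target_dir"
            return 0
        done
        return 1                              # no candidate had room
    }

In the run above the root overlay wins: target_space=84510343168 comfortably exceeds the padded 2214592512-byte request, so SPDK_TEST_STORAGE lands under the spdk tree as printed.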
00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@381 -- # local target_space new_size 00:08:34.915 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@382 -- # for target_dir in "${storage_candidates[@]}" 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # mount=/ 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@387 -- # target_space=84510343168 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@388 -- # (( target_space == 0 || target_space < requested_size )) 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # (( target_space >= requested_size )) 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == tmpfs ]] 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == ramfs ]] 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ / == / ]] 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@394 -- # new_size=12204523520 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@395 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@401 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:34.916 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@402 -- # return 0 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1680 -- # set -o errtrace 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # shopt -s extdebug 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1682 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1684 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1685 -- # true 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1687 -- # xtrace_fd 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:08:34.916 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:08:35.175 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.175 --rc genhtml_branch_coverage=1 00:08:35.175 --rc genhtml_function_coverage=1 00:08:35.175 --rc genhtml_legend=1 00:08:35.175 --rc geninfo_all_blocks=1 00:08:35.175 --rc geninfo_unexecuted_blocks=1 00:08:35.175 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.175 ' 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:08:35.175 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.175 --rc genhtml_branch_coverage=1 00:08:35.175 --rc genhtml_function_coverage=1 00:08:35.175 --rc genhtml_legend=1 00:08:35.175 --rc geninfo_all_blocks=1 00:08:35.175 --rc geninfo_unexecuted_blocks=1 00:08:35.175 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.175 ' 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:08:35.175 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.175 --rc genhtml_branch_coverage=1 00:08:35.175 --rc genhtml_function_coverage=1 00:08:35.175 --rc genhtml_legend=1 00:08:35.175 --rc geninfo_all_blocks=1 00:08:35.175 --rc geninfo_unexecuted_blocks=1 00:08:35.175 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.175 ' 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:08:35.175 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.175 --rc genhtml_branch_coverage=1 00:08:35.175 --rc genhtml_function_coverage=1 00:08:35.175 --rc genhtml_legend=1 00:08:35.175 --rc geninfo_all_blocks=1 00:08:35.175 --rc geninfo_unexecuted_blocks=1 00:08:35.175 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.175 ' 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@61 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:08:35.175 15:10:13 
llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@63 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # fuzz_num=25 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@65 -- # (( fuzz_num != 0 )) 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@67 -- # trap 'cleanup /tmp/llvm_fuzz* /var/tmp/suppress_nvmf_fuzz; exit 1' SIGINT SIGTERM EXIT 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@69 -- # mem_size=512 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@70 -- # [[ 1 -eq 1 ]] 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@71 -- # start_llvm_fuzz_short 25 1 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=25 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 0 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4400 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:35.175 15:10:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 00:08:35.175 [2024-11-20 15:10:13.689295] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:08:35.175 [2024-11-20 15:10:13.689371] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1473422 ] 00:08:35.434 [2024-11-20 15:10:13.913186] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.434 [2024-11-20 15:10:13.928564] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.434 [2024-11-20 15:10:13.981385] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:35.434 [2024-11-20 15:10:13.997638] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:08:35.434 INFO: Running with entropic power schedule (0xFF, 100). 00:08:35.434 INFO: Seed: 2308887388 00:08:35.434 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:08:35.434 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:08:35.434 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:08:35.434 INFO: A corpus is not provided, starting from an empty corpus 00:08:35.434 #2 INITED exec/s: 0 rss: 66Mb 00:08:35.434 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:35.434 This may also happen if the target rejected all inputs we tried so far 00:08:35.434 [2024-11-20 15:10:14.052513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:35.434 [2024-11-20 15:10:14.052551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.950 NEW_FUNC[1/714]: 0x459648 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:08:35.950 NEW_FUNC[2/714]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:35.950 #13 NEW cov: 12172 ft: 12165 corp: 2/75b lim: 320 exec/s: 0 rss: 73Mb L: 74/74 MS: 1 InsertRepeatedBytes- 00:08:35.950 [2024-11-20 15:10:14.423468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:35.950 [2024-11-20 15:10:14.423513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.950 #14 NEW cov: 12285 ft: 12763 corp: 3/149b lim: 320 exec/s: 0 rss: 74Mb L: 74/74 MS: 1 ShuffleBytes- 00:08:35.950 [2024-11-20 15:10:14.523614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (da) qid:0 cid:4 nsid:2020202 cdw10:02020202 cdw11:02020202 SGL TRANSPORT DATA BLOCK TRANSPORT 0x202020202020202 00:08:35.950 [2024-11-20 15:10:14.523653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.950 NEW_FUNC[1/1]: 0x198e048 in nvme_get_sgl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:159 00:08:35.950 #22 NEW cov: 12315 ft: 13092 corp: 4/275b lim: 320 exec/s: 0 rss: 74Mb L: 126/126 MS: 3 ChangeByte-ChangeBit-InsertRepeatedBytes- 
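Each `#N NEW` line above is standard libFuzzer status output: `cov` counts covered code edges, `ft` counts features (finer-grained coverage signals), `corp: 2/75b` means the corpus now holds 2 inputs totalling 75 bytes, `lim` is the current input-length cap, `exec/s` and `rss` report throughput and memory, `L: 74/74` gives the new input's length over the largest input seen so far, and `MS` lists the mutation sequence that produced it (here a single InsertRepeatedBytes). The NEW_FUNC lines show that run 0 exercises fuzz_admin_command in llvm_nvme_fuzz.c. The harness invocation itself is exactly the command traced above; reformatted for readability (only the $SPDK shorthand is introduced here, everything else is verbatim from the log):

    # Values as traced: core mask 0x1 (-m), 512 MiB of memory (-s), a one-second
    # budget (-t, from timen=1), and fuzzer type 0 (-Z) against TCP port 4400.
    SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    "$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
        -m 0x1 -s 512 \
        -P "$SPDK/../output/llvm/" \
        -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' \
        -c /tmp/fuzz_json_0.conf -t 1 \
        -D "$SPDK/../corpus/llvm_nvmf_0" -Z 0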
00:08:35.950 [2024-11-20 15:10:14.583745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (da) qid:0 cid:4 nsid:2020202 cdw10:02020202 cdw11:02020202 SGL TRANSPORT DATA BLOCK TRANSPORT 0x202020202020202 00:08:35.950 [2024-11-20 15:10:14.583782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.208 #23 NEW cov: 12400 ft: 13339 corp: 5/401b lim: 320 exec/s: 0 rss: 74Mb L: 126/126 MS: 1 ChangeByte- 00:08:36.208 [2024-11-20 15:10:14.673996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:36.208 [2024-11-20 15:10:14.674029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.208 [2024-11-20 15:10:14.674065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:5 nsid:2b2b2b2b cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x2b2b2b2b2b2b2b2b 00:08:36.208 [2024-11-20 15:10:14.674083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:36.208 NEW_FUNC[1/1]: 0x1558238 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2213 00:08:36.208 #24 NEW cov: 12431 ft: 13627 corp: 6/551b lim: 320 exec/s: 0 rss: 74Mb L: 150/150 MS: 1 InsertRepeatedBytes- 00:08:36.208 [2024-11-20 15:10:14.744134] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.208 [2024-11-20 15:10:14.744165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.208 #29 NEW cov: 12448 ft: 13729 corp: 7/638b lim: 320 exec/s: 0 rss: 74Mb L: 87/150 MS: 5 CopyPart-InsertByte-CopyPart-EraseBytes-InsertRepeatedBytes- 00:08:36.208 [2024-11-20 15:10:14.794388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:36.208 [2024-11-20 15:10:14.794420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.208 [2024-11-20 15:10:14.794461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:5 nsid:2b2b2b2b cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x2b2b2b2b2b2b2b2b 00:08:36.208 [2024-11-20 15:10:14.794478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:36.208 [2024-11-20 15:10:14.794507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b 00:08:36.208 [2024-11-20 15:10:14.794524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:36.208 [2024-11-20 15:10:14.794556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:7 nsid:2b2b2b2b cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x2b2b2b2b2b2b2b2b 00:08:36.208 [2024-11-20 15:10:14.794572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:36.208 #30 NEW cov: 12448 ft: 14123 corp: 8/911b lim: 320 exec/s: 0 rss: 74Mb L: 273/273 MS: 1 
CopyPart- 00:08:36.209 [2024-11-20 15:10:14.884575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:36.209 [2024-11-20 15:10:14.884608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.209 [2024-11-20 15:10:14.884644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:5 nsid:2b2b2b2b cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x2b2b2b2b2b2b2b2b 00:08:36.209 [2024-11-20 15:10:14.884660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:36.209 [2024-11-20 15:10:14.884688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b 00:08:36.209 [2024-11-20 15:10:14.884704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:36.209 [2024-11-20 15:10:14.884734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:7 nsid:2b2b2b2b cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x2b2b2b2b2b2b2b2b 00:08:36.209 [2024-11-20 15:10:14.884750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:36.467 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:36.467 #36 NEW cov: 12471 ft: 14206 corp: 9/1184b lim: 320 exec/s: 0 rss: 74Mb L: 273/273 MS: 1 ShuffleBytes- 00:08:36.467 [2024-11-20 15:10:14.984772] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.467 [2024-11-20 15:10:14.984807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.467 #37 NEW cov: 12471 ft: 14227 corp: 10/1272b lim: 320 exec/s: 37 rss: 74Mb L: 88/273 MS: 1 InsertByte- 00:08:36.467 [2024-11-20 15:10:15.085005] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x202020202020202 00:08:36.467 [2024-11-20 15:10:15.085184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (da) qid:0 cid:4 nsid:2020202 cdw10:02020202 cdw11:02020202 SGL TRANSPORT DATA BLOCK TRANSPORT 0x202020202020202 00:08:36.467 [2024-11-20 15:10:15.085208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.467 [2024-11-20 15:10:15.085243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:2020202 cdw10:02020202 cdw11:02020202 SGL TRANSPORT DATA BLOCK TRANSPORT 0x202020202020202 00:08:36.467 [2024-11-20 15:10:15.085259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:36.467 [2024-11-20 15:10:15.085295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ca) qid:0 cid:6 nsid:cacacaca cdw10:cacacaca cdw11:cacacaca SGL TRANSPORT DATA BLOCK TRANSPORT 0xcacacacacacacaca 00:08:36.467 [2024-11-20 15:10:15.085312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:36.467 NEW_FUNC[1/1]: 0x132ee38 in nvmf_ctrlr_get_log_page 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:2636 00:08:36.467 #43 NEW cov: 12501 ft: 14497 corp: 11/1465b lim: 320 exec/s: 43 rss: 74Mb L: 193/273 MS: 1 InsertRepeatedBytes- 00:08:36.725 [2024-11-20 15:10:15.155273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:36.725 [2024-11-20 15:10:15.155307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.725 [2024-11-20 15:10:15.155353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:5 nsid:2b2b2b2b cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x2b2b2b2b2b2b2b2b 00:08:36.725 [2024-11-20 15:10:15.155371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:36.725 #49 NEW cov: 12501 ft: 14530 corp: 12/1615b lim: 320 exec/s: 49 rss: 74Mb L: 150/273 MS: 1 ShuffleBytes- 00:08:36.725 [2024-11-20 15:10:15.215416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (da) qid:0 cid:4 nsid:2020202 cdw10:95959595 cdw11:95959595 SGL TRANSPORT DATA BLOCK TRANSPORT 0x202020202020202 00:08:36.725 [2024-11-20 15:10:15.215451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.725 [2024-11-20 15:10:15.215488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (95) qid:0 cid:5 nsid:95959595 cdw10:02020202 cdw11:02020202 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.725 [2024-11-20 15:10:15.215504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:36.725 NEW_FUNC[1/1]: 0x198ebb8 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:08:36.725 #50 NEW cov: 12514 ft: 14894 corp: 13/1796b lim: 320 exec/s: 50 rss: 74Mb L: 181/273 MS: 1 InsertRepeatedBytes- 00:08:36.725 [2024-11-20 15:10:15.275509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:36.725 [2024-11-20 15:10:15.275541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.725 #51 NEW cov: 12514 ft: 14946 corp: 14/1923b lim: 320 exec/s: 51 rss: 74Mb L: 127/273 MS: 1 CopyPart- 00:08:36.725 [2024-11-20 15:10:15.365745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:36.725 [2024-11-20 15:10:15.365777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.725 #52 NEW cov: 12514 ft: 14973 corp: 15/1998b lim: 320 exec/s: 52 rss: 74Mb L: 75/273 MS: 1 InsertByte- 00:08:36.984 [2024-11-20 15:10:15.415853] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x202020202020202 00:08:36.984 [2024-11-20 15:10:15.416010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:2020202 cdw10:02020202 cdw11:02020202 SGL TRANSPORT DATA BLOCK TRANSPORT 0x202025d02020202 00:08:36.984 [2024-11-20 15:10:15.416037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:36.984 #53 NEW cov: 12518 ft: 15079 corp: 16/2182b 
lim: 320 exec/s: 53 rss: 74Mb L: 184/273 MS: 1 CrossOver- 00:08:36.984 [2024-11-20 15:10:15.475983] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x202020202020202 00:08:36.984 [2024-11-20 15:10:15.476107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (da) qid:0 cid:4 nsid:2020202 cdw10:02020202 cdw11:02020202 SGL TRANSPORT DATA BLOCK TRANSPORT 0x202020202020202 00:08:36.984 [2024-11-20 15:10:15.476135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.984 [2024-11-20 15:10:15.476167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:2025d02 cdw10:02020202 cdw11:02020202 SGL TRANSPORT DATA BLOCK TRANSPORT 0x202020202020202 00:08:36.984 [2024-11-20 15:10:15.476183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:36.984 #54 NEW cov: 12518 ft: 15126 corp: 17/2312b lim: 320 exec/s: 54 rss: 74Mb L: 130/273 MS: 1 CrossOver- 00:08:36.984 [2024-11-20 15:10:15.556951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:36.984 [2024-11-20 15:10:15.556980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.984 [2024-11-20 15:10:15.557035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:36.984 [2024-11-20 15:10:15.557049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:36.984 #55 NEW cov: 12518 ft: 15328 corp: 18/2445b lim: 320 exec/s: 55 rss: 74Mb L: 133/273 MS: 1 CopyPart- 00:08:36.984 [2024-11-20 15:10:15.597040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:20202020 cdw10:00000000 cdw11:00000000 00:08:36.984 [2024-11-20 15:10:15.597069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.984 [2024-11-20 15:10:15.597121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:36.984 [2024-11-20 15:10:15.597135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:36.984 #61 NEW cov: 12518 ft: 15350 corp: 19/2599b lim: 320 exec/s: 61 rss: 74Mb L: 154/273 MS: 1 InsertRepeatedBytes- 00:08:36.984 [2024-11-20 15:10:15.657400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:36.984 [2024-11-20 15:10:15.657427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.984 [2024-11-20 15:10:15.657488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:5 nsid:2b2b2b2b cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x2b2b2b2b2b2b2b2b 00:08:36.984 [2024-11-20 15:10:15.657503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:36.984 [2024-11-20 15:10:15.657556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b 
cdw11:2b2b2b2b 00:08:36.984 [2024-11-20 15:10:15.657572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:36.984 [2024-11-20 15:10:15.657632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:7 nsid:2b2b2b2b cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x2b2b2b2b2b2b2b2b 00:08:36.984 [2024-11-20 15:10:15.657646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:37.242 #62 NEW cov: 12518 ft: 15385 corp: 20/2872b lim: 320 exec/s: 62 rss: 74Mb L: 273/273 MS: 1 ChangeBinInt- 00:08:37.242 [2024-11-20 15:10:15.717585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:37.242 [2024-11-20 15:10:15.717612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.242 [2024-11-20 15:10:15.717673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:5 nsid:2b2b2b2b cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x2b2b2b2b2b2b2b2b 00:08:37.242 [2024-11-20 15:10:15.717688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.242 [2024-11-20 15:10:15.717743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b 00:08:37.242 [2024-11-20 15:10:15.717758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.242 [2024-11-20 15:10:15.717819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:7 nsid:2b2b2b2b cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x2b2b2b2b2b2b2b2b 00:08:37.243 [2024-11-20 15:10:15.717833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:37.243 #63 NEW cov: 12518 ft: 15460 corp: 21/3145b lim: 320 exec/s: 63 rss: 74Mb L: 273/273 MS: 1 CopyPart- 00:08:37.243 [2024-11-20 15:10:15.777570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:20202020 cdw10:00000000 cdw11:00000000 00:08:37.243 [2024-11-20 15:10:15.777597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.243 [2024-11-20 15:10:15.777656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (9c) qid:0 cid:5 nsid:9c9c9c9c cdw10:00000000 cdw11:00000000 00:08:37.243 [2024-11-20 15:10:15.777671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.243 #64 NEW cov: 12518 ft: 15496 corp: 22/3332b lim: 320 exec/s: 64 rss: 74Mb L: 187/273 MS: 1 InsertRepeatedBytes- 00:08:37.243 [2024-11-20 15:10:15.837919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:37.243 [2024-11-20 15:10:15.837946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.243 [2024-11-20 15:10:15.838009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:5 nsid:2b2b2b2b 
cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x2b2b2b2b2b2b2b2b 00:08:37.243 [2024-11-20 15:10:15.838025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.243 [2024-11-20 15:10:15.838079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b 00:08:37.243 [2024-11-20 15:10:15.838094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.243 [2024-11-20 15:10:15.838154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:7 nsid:2b2b2b2b cdw10:2b2b2bd1 cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x2b2b2b2b2b2b2b2b 00:08:37.243 [2024-11-20 15:10:15.838168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:37.243 #65 NEW cov: 12518 ft: 15563 corp: 23/3606b lim: 320 exec/s: 65 rss: 74Mb L: 274/274 MS: 1 InsertByte- 00:08:37.243 [2024-11-20 15:10:15.877909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:37.243 [2024-11-20 15:10:15.877935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.243 [2024-11-20 15:10:15.877997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:5 nsid:2b2b2b2b cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x2b2b2b2b2b2b2b2b 00:08:37.243 [2024-11-20 15:10:15.878012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.243 [2024-11-20 15:10:15.878073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.243 [2024-11-20 15:10:15.878087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.243 #66 NEW cov: 12518 ft: 15573 corp: 24/3840b lim: 320 exec/s: 66 rss: 75Mb L: 234/274 MS: 1 InsertRepeatedBytes- 00:08:37.501 [2024-11-20 15:10:15.937922] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.501 [2024-11-20 15:10:15.937949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.501 #67 NEW cov: 12518 ft: 15667 corp: 25/3927b lim: 320 exec/s: 67 rss: 75Mb L: 87/274 MS: 1 ChangeByte- 00:08:37.501 [2024-11-20 15:10:15.978038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (da) qid:0 cid:4 nsid:2020202 cdw10:02020202 cdw11:02020202 SGL TRANSPORT DATA BLOCK TRANSPORT 0x202020202020202 00:08:37.501 [2024-11-20 15:10:15.978065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.501 #68 NEW cov: 12518 ft: 15688 corp: 26/4053b lim: 320 exec/s: 68 rss: 75Mb L: 126/274 MS: 1 ChangeBit- 00:08:37.501 [2024-11-20 15:10:16.018256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:37.501 [2024-11-20 15:10:16.018282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.501 [2024-11-20 15:10:16.018340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:2b000000 cdw10:2b2b2b2b cdw11:2b2b2b2b 00:08:37.501 [2024-11-20 15:10:16.018355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.501 #69 NEW cov: 12518 ft: 15735 corp: 27/4203b lim: 320 exec/s: 34 rss: 75Mb L: 150/274 MS: 1 CopyPart- 00:08:37.501 #69 DONE cov: 12518 ft: 15735 corp: 27/4203b lim: 320 exec/s: 34 rss: 75Mb 00:08:37.501 Done 69 runs in 2 second(s) 00:08:37.501 15:10:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_0.conf /var/tmp/suppress_nvmf_fuzz 00:08:37.501 15:10:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:37.501 15:10:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:37.501 15:10:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:37.501 15:10:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:08:37.501 15:10:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:37.501 15:10:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:37.501 15:10:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:08:37.501 15:10:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:08:37.501 15:10:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:37.501 15:10:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:37.501 15:10:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 1 00:08:37.501 15:10:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4401 00:08:37.501 15:10:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:08:37.501 15:10:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:08:37.501 15:10:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:37.501 15:10:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:37.501 15:10:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:37.501 15:10:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 00:08:37.760 [2024-11-20 15:10:16.189131] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
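The second iteration's setup, traced just above, repeats the same fixed recipe for every fuzzer type: names are derived from the run index, the NVMe/TCP listener moves to port 44NN, and two known leak reports are suppressed so LSAN does not fail the short run. A condensed bash sketch pieced together from the start_llvm_fuzz traces; the run_llvm_nvmf_fuzzer name is hypothetical, the individual commands mirror the log:

    run_llvm_nvmf_fuzzer() {
        local fuzzer_type=$1 timen=$2 core=$3
        local port corpus_dir nvmf_cfg suppress_file=/var/tmp/suppress_nvmf_fuzz
        port=44$(printf %02d "$fuzzer_type")  # run 0 -> 4400, run 1 -> 4401, ...
        corpus_dir=$SPDK/../corpus/llvm_nvmf_${fuzzer_type}
        nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
        mkdir -p "$corpus_dir"

        # Each run gets its own TCP port: rewrite trsvcid in the JSON template.
        sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
            "$SPDK/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

        # Leak suppressions echoed at run.sh@41-42 in the trace.
        echo leak:spdk_nvmf_qpair_disconnect >> "$suppress_file"
        echo leak:nvmf_ctrlr_create >> "$suppress_file"

        LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0 \
        "$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
            -P "$SPDK/../output/llvm/" \
            -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
            -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"

        # The trace ends each run by removing the per-run config and suppressions.
        rm -rf "$nvmf_cfg" "$suppress_file"
    }

Run 1, starting above, targets fuzz_admin_get_log_page_command (per its NEW_FUNC line) on port 4401, with a fresh seed and an empty corpus just as run 0 did.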
00:08:37.760 [2024-11-20 15:10:16.189218] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1473713 ] 00:08:37.760 [2024-11-20 15:10:16.393129] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:37.760 [2024-11-20 15:10:16.407874] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.018 [2024-11-20 15:10:16.460682] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:38.018 [2024-11-20 15:10:16.476914] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:08:38.018 INFO: Running with entropic power schedule (0xFF, 100). 00:08:38.018 INFO: Seed: 491924447 00:08:38.018 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:08:38.018 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:08:38.018 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:08:38.018 INFO: A corpus is not provided, starting from an empty corpus 00:08:38.018 #2 INITED exec/s: 0 rss: 66Mb 00:08:38.018 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:38.018 This may also happen if the target rejected all inputs we tried so far 00:08:38.018 [2024-11-20 15:10:16.543725] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:38.018 [2024-11-20 15:10:16.544492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.018 [2024-11-20 15:10:16.544537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.018 [2024-11-20 15:10:16.544644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.018 [2024-11-20 15:10:16.544661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.276 NEW_FUNC[1/716]: 0x459f48 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:08:38.276 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:38.276 #4 NEW cov: 12306 ft: 12305 corp: 2/15b lim: 30 exec/s: 0 rss: 73Mb L: 14/14 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:38.276 [2024-11-20 15:10:16.894557] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:38.276 [2024-11-20 15:10:16.895541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.276 [2024-11-20 15:10:16.895586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.276 [2024-11-20 15:10:16.895686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.276 [2024-11-20 15:10:16.895707] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.276 [2024-11-20 15:10:16.895807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.276 [2024-11-20 15:10:16.895825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.276 #5 NEW cov: 12419 ft: 13321 corp: 3/35b lim: 30 exec/s: 0 rss: 73Mb L: 20/20 MS: 1 CopyPart- 00:08:38.534 [2024-11-20 15:10:16.964838] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:38.534 [2024-11-20 15:10:16.965867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.534 [2024-11-20 15:10:16.965894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.534 [2024-11-20 15:10:16.965989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.534 [2024-11-20 15:10:16.966005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.534 [2024-11-20 15:10:16.966100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.534 [2024-11-20 15:10:16.966114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.534 #6 NEW cov: 12425 ft: 13467 corp: 4/55b lim: 30 exec/s: 0 rss: 74Mb L: 20/20 MS: 1 ShuffleBytes- 00:08:38.534 [2024-11-20 15:10:17.035018] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:38.534 [2024-11-20 15:10:17.035551] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:08:38.534 [2024-11-20 15:10:17.036032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.534 [2024-11-20 15:10:17.036058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.534 [2024-11-20 15:10:17.036149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.534 [2024-11-20 15:10:17.036165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.534 [2024-11-20 15:10:17.036258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.534 [2024-11-20 15:10:17.036273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.534 #7 NEW cov: 12510 ft: 13771 corp: 5/75b lim: 30 exec/s: 0 rss: 74Mb L: 20/20 MS: 1 ChangeByte- 00:08:38.535 [2024-11-20 15:10:17.085292] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len 
(15364) > buf size (4096) 00:08:38.535 [2024-11-20 15:10:17.086309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.535 [2024-11-20 15:10:17.086339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.535 [2024-11-20 15:10:17.086424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.535 [2024-11-20 15:10:17.086440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.535 [2024-11-20 15:10:17.086528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.535 [2024-11-20 15:10:17.086543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.535 #8 NEW cov: 12510 ft: 13818 corp: 6/95b lim: 30 exec/s: 0 rss: 74Mb L: 20/20 MS: 1 ShuffleBytes- 00:08:38.535 [2024-11-20 15:10:17.135533] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:38.535 [2024-11-20 15:10:17.136032] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:08:38.535 [2024-11-20 15:10:17.136507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.535 [2024-11-20 15:10:17.136538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.535 [2024-11-20 15:10:17.136627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.535 [2024-11-20 15:10:17.136641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.535 [2024-11-20 15:10:17.136741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.535 [2024-11-20 15:10:17.136757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.535 #9 NEW cov: 12510 ft: 13869 corp: 7/115b lim: 30 exec/s: 0 rss: 74Mb L: 20/20 MS: 1 CopyPart- 00:08:38.535 [2024-11-20 15:10:17.205759] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:38.535 [2024-11-20 15:10:17.206765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f0000a2 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.535 [2024-11-20 15:10:17.206791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.535 [2024-11-20 15:10:17.206885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.535 [2024-11-20 15:10:17.206900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.535 [2024-11-20 15:10:17.206988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.535 [2024-11-20 15:10:17.207004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.793 #10 NEW cov: 12510 ft: 13996 corp: 8/136b lim: 30 exec/s: 0 rss: 74Mb L: 21/21 MS: 1 InsertByte- 00:08:38.793 [2024-11-20 15:10:17.276034] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:38.793 [2024-11-20 15:10:17.276536] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:08:38.793 [2024-11-20 15:10:17.277025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.793 [2024-11-20 15:10:17.277053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.793 [2024-11-20 15:10:17.277143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.793 [2024-11-20 15:10:17.277157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.793 [2024-11-20 15:10:17.277244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.793 [2024-11-20 15:10:17.277258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.793 #11 NEW cov: 12510 ft: 14064 corp: 9/156b lim: 30 exec/s: 0 rss: 74Mb L: 20/21 MS: 1 CopyPart- 00:08:38.793 [2024-11-20 15:10:17.326363] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:38.793 [2024-11-20 15:10:17.326884] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:08:38.793 [2024-11-20 15:10:17.327150] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa 00:08:38.793 [2024-11-20 15:10:17.327650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.793 [2024-11-20 15:10:17.327681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.793 [2024-11-20 15:10:17.327775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.794 [2024-11-20 15:10:17.327789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.794 [2024-11-20 15:10:17.327881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.794 [2024-11-20 15:10:17.327896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.794 [2024-11-20 
15:10:17.327992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.794 [2024-11-20 15:10:17.328007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:38.794 #12 NEW cov: 12516 ft: 14615 corp: 10/180b lim: 30 exec/s: 0 rss: 74Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:08:38.794 [2024-11-20 15:10:17.396867] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (786436) > buf size (4096) 00:08:38.794 [2024-11-20 15:10:17.397379] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524292) > buf size (4096) 00:08:38.794 [2024-11-20 15:10:17.397857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.794 [2024-11-20 15:10:17.397884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.794 [2024-11-20 15:10:17.397969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.794 [2024-11-20 15:10:17.397985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.794 [2024-11-20 15:10:17.398073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.794 [2024-11-20 15:10:17.398088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.794 [2024-11-20 15:10:17.398178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.794 [2024-11-20 15:10:17.398192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:38.794 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:38.794 #13 NEW cov: 12539 ft: 14709 corp: 11/209b lim: 30 exec/s: 0 rss: 74Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:08:38.794 [2024-11-20 15:10:17.466979] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (3840) > len (4) 00:08:38.794 [2024-11-20 15:10:17.468010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.794 [2024-11-20 15:10:17.468038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.794 [2024-11-20 15:10:17.468129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.794 [2024-11-20 15:10:17.468146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.794 [2024-11-20 15:10:17.468243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:38.794 [2024-11-20 15:10:17.468258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.052 #14 NEW cov: 12545 ft: 14760 corp: 12/229b lim: 30 exec/s: 0 rss: 74Mb L: 20/29 MS: 1 ShuffleBytes- 00:08:39.052 [2024-11-20 15:10:17.537363] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:39.052 [2024-11-20 15:10:17.538324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f0000a2 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.052 [2024-11-20 15:10:17.538352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.052 [2024-11-20 15:10:17.538445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.052 [2024-11-20 15:10:17.538461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.052 [2024-11-20 15:10:17.538550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.052 [2024-11-20 15:10:17.538565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.052 #15 NEW cov: 12545 ft: 14789 corp: 13/250b lim: 30 exec/s: 15 rss: 74Mb L: 21/29 MS: 1 ShuffleBytes- 00:08:39.052 [2024-11-20 15:10:17.607814] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:39.052 [2024-11-20 15:10:17.608342] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:08:39.052 [2024-11-20 15:10:17.608844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.052 [2024-11-20 15:10:17.608873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.052 [2024-11-20 15:10:17.608975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.052 [2024-11-20 15:10:17.608991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.052 [2024-11-20 15:10:17.609084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.052 [2024-11-20 15:10:17.609102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.052 #16 NEW cov: 12545 ft: 14834 corp: 14/270b lim: 30 exec/s: 16 rss: 74Mb L: 20/29 MS: 1 ShuffleBytes- 00:08:39.052 [2024-11-20 15:10:17.658277] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:39.052 [2024-11-20 15:10:17.658575] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (256) > len (4) 00:08:39.052 [2024-11-20 15:10:17.659299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f000000 
cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.052 [2024-11-20 15:10:17.659332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.052 [2024-11-20 15:10:17.659422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.052 [2024-11-20 15:10:17.659437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.052 [2024-11-20 15:10:17.659532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.052 [2024-11-20 15:10:17.659550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.052 #17 NEW cov: 12545 ft: 14888 corp: 15/290b lim: 30 exec/s: 17 rss: 74Mb L: 20/29 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:39.052 [2024-11-20 15:10:17.728738] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:39.052 [2024-11-20 15:10:17.729281] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524292) > buf size (4096) 00:08:39.052 [2024-11-20 15:10:17.729787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f0000a2 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.052 [2024-11-20 15:10:17.729816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.052 [2024-11-20 15:10:17.729907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.052 [2024-11-20 15:10:17.729923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.052 [2024-11-20 15:10:17.730014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.052 [2024-11-20 15:10:17.730030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.311 #18 NEW cov: 12545 ft: 14895 corp: 16/312b lim: 30 exec/s: 18 rss: 74Mb L: 22/29 MS: 1 InsertByte- 00:08:39.311 [2024-11-20 15:10:17.779061] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:39.311 [2024-11-20 15:10:17.779579] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:08:39.311 [2024-11-20 15:10:17.780053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.311 [2024-11-20 15:10:17.780081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.311 [2024-11-20 15:10:17.780175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.311 [2024-11-20 15:10:17.780191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.311 [2024-11-20 15:10:17.780287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3a000020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.311 [2024-11-20 15:10:17.780302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.311 #19 NEW cov: 12545 ft: 14963 corp: 17/332b lim: 30 exec/s: 19 rss: 74Mb L: 20/29 MS: 1 ChangeBit- 00:08:39.311 [2024-11-20 15:10:17.839519] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:39.311 [2024-11-20 15:10:17.840052] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:08:39.311 [2024-11-20 15:10:17.840305] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa 00:08:39.311 [2024-11-20 15:10:17.840770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.311 [2024-11-20 15:10:17.840800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.311 [2024-11-20 15:10:17.840885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.311 [2024-11-20 15:10:17.840901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.311 [2024-11-20 15:10:17.840997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3a000008 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.311 [2024-11-20 15:10:17.841011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.311 [2024-11-20 15:10:17.841102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.311 [2024-11-20 15:10:17.841117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:39.311 #20 NEW cov: 12545 ft: 14976 corp: 18/356b lim: 30 exec/s: 20 rss: 74Mb L: 24/29 MS: 1 ChangeBit- 00:08:39.311 [2024-11-20 15:10:17.909742] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:39.311 [2024-11-20 15:10:17.910016] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524292) > buf size (4096) 00:08:39.311 [2024-11-20 15:10:17.910773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.311 [2024-11-20 15:10:17.910802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.311 [2024-11-20 15:10:17.910890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.311 [2024-11-20 15:10:17.910907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.311 [2024-11-20 
15:10:17.910997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.311 [2024-11-20 15:10:17.911013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.311 #21 NEW cov: 12545 ft: 14995 corp: 19/376b lim: 30 exec/s: 21 rss: 74Mb L: 20/29 MS: 1 ChangeByte- 00:08:39.311 [2024-11-20 15:10:17.959721] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:39.311 [2024-11-20 15:10:17.960455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.311 [2024-11-20 15:10:17.960481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.311 [2024-11-20 15:10:17.960566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.311 [2024-11-20 15:10:17.960581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.311 #22 NEW cov: 12545 ft: 15014 corp: 20/389b lim: 30 exec/s: 22 rss: 74Mb L: 13/29 MS: 1 EraseBytes- 00:08:39.570 [2024-11-20 15:10:18.010209] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:39.570 [2024-11-20 15:10:18.010485] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (5124) > buf size (4096) 00:08:39.570 [2024-11-20 15:10:18.011187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.570 [2024-11-20 15:10:18.011214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.570 [2024-11-20 15:10:18.011304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:05000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.570 [2024-11-20 15:10:18.011326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.570 [2024-11-20 15:10:18.011422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.570 [2024-11-20 15:10:18.011438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.570 #23 NEW cov: 12545 ft: 15031 corp: 21/409b lim: 30 exec/s: 23 rss: 74Mb L: 20/29 MS: 1 ChangeBinInt- 00:08:39.570 [2024-11-20 15:10:18.060809] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (786436) > buf size (4096) 00:08:39.570 [2024-11-20 15:10:18.061359] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524292) > buf size (4096) 00:08:39.570 [2024-11-20 15:10:18.061603] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa 00:08:39.570 [2024-11-20 15:10:18.062100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.570 [2024-11-20 
15:10:18.062128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.570 [2024-11-20 15:10:18.062216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.570 [2024-11-20 15:10:18.062232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.570 [2024-11-20 15:10:18.062324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.570 [2024-11-20 15:10:18.062356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.570 [2024-11-20 15:10:18.062445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.570 [2024-11-20 15:10:18.062460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:39.570 [2024-11-20 15:10:18.062552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.570 [2024-11-20 15:10:18.062567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:39.570 #24 NEW cov: 12545 ft: 15151 corp: 22/439b lim: 30 exec/s: 24 rss: 74Mb L: 30/30 MS: 1 CopyPart- 00:08:39.570 [2024-11-20 15:10:18.130718] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (539652) > buf size (4096) 00:08:39.570 [2024-11-20 15:10:18.131465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f000200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.570 [2024-11-20 15:10:18.131491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.570 [2024-11-20 15:10:18.131584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.570 [2024-11-20 15:10:18.131600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.570 #25 NEW cov: 12545 ft: 15157 corp: 23/453b lim: 30 exec/s: 25 rss: 74Mb L: 14/30 MS: 1 CMP- DE: ">\000\000\000\000\000\000\000"- 00:08:39.570 [2024-11-20 15:10:18.181114] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:39.570 [2024-11-20 15:10:18.181646] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:08:39.570 [2024-11-20 15:10:18.181913] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000000a 00:08:39.570 [2024-11-20 15:10:18.182423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.570 [2024-11-20 15:10:18.182455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.570 [2024-11-20 15:10:18.182555] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.570 [2024-11-20 15:10:18.182571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.570 [2024-11-20 15:10:18.182666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.570 [2024-11-20 15:10:18.182683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.571 [2024-11-20 15:10:18.182776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00008301 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.571 [2024-11-20 15:10:18.182790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:39.571 #26 NEW cov: 12545 ft: 15178 corp: 24/477b lim: 30 exec/s: 26 rss: 74Mb L: 24/30 MS: 1 ChangeBinInt- 00:08:39.571 [2024-11-20 15:10:18.231245] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:39.571 [2024-11-20 15:10:18.232239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f0000a2 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.571 [2024-11-20 15:10:18.232265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.571 [2024-11-20 15:10:18.232361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.571 [2024-11-20 15:10:18.232377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.571 [2024-11-20 15:10:18.232487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.571 [2024-11-20 15:10:18.232502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.571 #27 NEW cov: 12545 ft: 15181 corp: 25/498b lim: 30 exec/s: 27 rss: 74Mb L: 21/30 MS: 1 ShuffleBytes- 00:08:39.829 [2024-11-20 15:10:18.281633] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:39.829 [2024-11-20 15:10:18.282155] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa 00:08:39.829 [2024-11-20 15:10:18.282666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f0000a2 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.829 [2024-11-20 15:10:18.282696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.829 [2024-11-20 15:10:18.282792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.829 [2024-11-20 15:10:18.282809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.829 
[2024-11-20 15:10:18.282900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.829 [2024-11-20 15:10:18.282915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.829 #28 NEW cov: 12545 ft: 15194 corp: 26/516b lim: 30 exec/s: 28 rss: 75Mb L: 18/30 MS: 1 EraseBytes- 00:08:39.829 [2024-11-20 15:10:18.351774] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (15364) > buf size (4096) 00:08:39.829 [2024-11-20 15:10:18.352287] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa 00:08:39.829 [2024-11-20 15:10:18.352783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.829 [2024-11-20 15:10:18.352811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.829 [2024-11-20 15:10:18.352897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.829 [2024-11-20 15:10:18.352912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.829 [2024-11-20 15:10:18.352996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.829 [2024-11-20 15:10:18.353011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.829 #29 NEW cov: 12545 ft: 15213 corp: 27/534b lim: 30 exec/s: 29 rss: 75Mb L: 18/30 MS: 1 EraseBytes- 00:08:39.829 [2024-11-20 15:10:18.422051] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (786436) > buf size (4096) 00:08:39.829 [2024-11-20 15:10:18.422571] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:08:39.829 [2024-11-20 15:10:18.423052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.829 [2024-11-20 15:10:18.423079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.829 [2024-11-20 15:10:18.423165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.829 [2024-11-20 15:10:18.423180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.829 [2024-11-20 15:10:18.423270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.829 [2024-11-20 15:10:18.423286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.829 #30 NEW cov: 12545 ft: 15221 corp: 28/554b lim: 30 exec/s: 30 rss: 75Mb L: 20/30 MS: 1 ShuffleBytes- 00:08:39.829 [2024-11-20 15:10:18.472282] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset 
(7936) > len (4) 00:08:39.829 [2024-11-20 15:10:18.473240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.829 [2024-11-20 15:10:18.473265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.829 [2024-11-20 15:10:18.473359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.829 [2024-11-20 15:10:18.473375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.829 [2024-11-20 15:10:18.473467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.829 [2024-11-20 15:10:18.473482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.088 #31 NEW cov: 12545 ft: 15235 corp: 29/574b lim: 30 exec/s: 31 rss: 75Mb L: 20/30 MS: 1 ChangeBit- 00:08:40.088 [2024-11-20 15:10:18.542538] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (7936) > len (4) 00:08:40.088 [2024-11-20 15:10:18.543046] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f3f3 00:08:40.088 [2024-11-20 15:10:18.543299] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1036240) > buf size (4096) 00:08:40.088 [2024-11-20 15:10:18.543777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.088 [2024-11-20 15:10:18.543805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.088 [2024-11-20 15:10:18.543894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.088 [2024-11-20 15:10:18.543909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.088 [2024-11-20 15:10:18.544004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:000083f3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.088 [2024-11-20 15:10:18.544020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.088 [2024-11-20 15:10:18.544109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:f3f383f3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.088 [2024-11-20 15:10:18.544125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.088 #32 pulse cov: 12545 ft: 15235 corp: 29/574b lim: 30 exec/s: 16 rss: 75Mb 00:08:40.088 [2024-11-20 15:10:18.612872] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (7936) > len (4) 00:08:40.088 [2024-11-20 15:10:18.613364] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f3f3 00:08:40.088 [2024-11-20 15:10:18.613638] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: 
len (1036240) > buf size (4096) 00:08:40.088 [2024-11-20 15:10:18.614123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.088 [2024-11-20 15:10:18.614150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.088 [2024-11-20 15:10:18.614241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.088 [2024-11-20 15:10:18.614255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.088 [2024-11-20 15:10:18.614345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:000083f3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.088 [2024-11-20 15:10:18.614359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.088 [2024-11-20 15:10:18.614446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:f3f38340 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.088 [2024-11-20 15:10:18.614461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.088 #33 NEW cov: 12545 ft: 15308 corp: 30/603b lim: 30 exec/s: 16 rss: 75Mb L: 29/30 MS: 2 InsertRepeatedBytes-InsertByte- 00:08:40.088 #33 DONE cov: 12545 ft: 15308 corp: 30/603b lim: 30 exec/s: 16 rss: 75Mb 00:08:40.088 ###### Recommended dictionary. ###### 00:08:40.088 "\001\000\000\000\000\000\000\000" # Uses: 0 00:08:40.088 ">\000\000\000\000\000\000\000" # Uses: 0 00:08:40.088 ###### End of recommended dictionary. 
###### 00:08:40.088 Done 33 runs in 2 second(s) 00:08:40.088 15:10:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_1.conf /var/tmp/suppress_nvmf_fuzz 00:08:40.088 15:10:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:40.088 15:10:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:40.088 15:10:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:40.088 15:10:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:08:40.088 15:10:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:40.088 15:10:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:40.088 15:10:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:40.088 15:10:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:08:40.088 15:10:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:40.088 15:10:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:40.088 15:10:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 2 00:08:40.088 15:10:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4402 00:08:40.088 15:10:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:40.088 15:10:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:08:40.088 15:10:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:40.088 15:10:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:40.088 15:10:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:40.089 15:10:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 00:08:40.346 [2024-11-20 15:10:18.776031] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:08:40.347 [2024-11-20 15:10:18.776099] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1474076 ] 00:08:40.347 [2024-11-20 15:10:18.978973] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:40.347 [2024-11-20 15:10:18.994942] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:40.610 [2024-11-20 15:10:19.048109] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:40.610 [2024-11-20 15:10:19.064359] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:08:40.610 INFO: Running with entropic power schedule (0xFF, 100). 
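The repeated "Get log page: len (N) > buf size (4096)" errors in the run that just completed are nvmf_ctrlr_get_log_page() rejecting fuzzed GET LOG PAGE commands that request more data than the harness's 4096-byte response buffer. A minimal sketch of where those byte counts come from, assuming the standard NVMe encoding of the transfer size (a zero-based dword count split across NUMDL in CDW10 bits 31:16 and NUMDU in CDW11 bits 15:0); the helper name below is hypothetical, not SPDK's:

#include <stdint.h>
#include <stdio.h>

/* Transfer length in bytes for GET LOG PAGE (opcode 02h): NUMDL and NUMDU
 * form a zero-based dword count, so the byte length is (count + 1) * 4. */
static uint64_t glp_len_bytes(uint32_t cdw10, uint32_t cdw11)
{
	uint64_t numdl = (cdw10 >> 16) & 0xffff; /* CDW10 bits 31:16 */
	uint64_t numdu = cdw11 & 0xffff;         /* CDW11 bits 15:0  */
	return (((numdu << 16) | numdl) + 1) * 4;
}

int main(void)
{
	/* cdw10/cdw11 pairs taken from the log entries above */
	printf("%llu\n", (unsigned long long)glp_len_bytes(0x0f000000, 0x00000000)); /* 15364  */
	printf("%llu\n", (unsigned long long)glp_len_bytes(0x3a000000, 0x00000000)); /* 59396  */
	printf("%llu\n", (unsigned long long)glp_len_bytes(0x00000000, 0x00000003)); /* 786436 */
	return 0;
}

Compiled and run, this prints 15364, 59396 and 786436, the same lengths flagged against the corresponding cdw10/cdw11 values printed in the admin-queue commands above; the companion "Invalid log page offset" errors come from the offset fields (CDW12/CDW13) being fuzzed past the reported log length in the same way.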
00:08:40.610 INFO: Seed: 3079957416 00:08:40.610 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:08:40.610 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:08:40.610 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:40.610 INFO: A corpus is not provided, starting from an empty corpus 00:08:40.610 #2 INITED exec/s: 0 rss: 66Mb 00:08:40.610 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:40.610 This may also happen if the target rejected all inputs we tried so far 00:08:40.610 [2024-11-20 15:10:19.131812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:20002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.610 [2024-11-20 15:10:19.131859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.869 NEW_FUNC[1/715]: 0x45c9f8 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:08:40.869 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:40.869 #23 NEW cov: 12218 ft: 12213 corp: 2/8b lim: 35 exec/s: 0 rss: 73Mb L: 7/7 MS: 1 InsertRepeatedBytes- 00:08:40.869 [2024-11-20 15:10:19.472534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:9c002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.869 [2024-11-20 15:10:19.472579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.870 [2024-11-20 15:10:19.472668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:9c9c009c cdw11:9c009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.870 [2024-11-20 15:10:19.472688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.870 #24 NEW cov: 12341 ft: 13200 corp: 3/26b lim: 35 exec/s: 0 rss: 74Mb L: 18/18 MS: 1 InsertRepeatedBytes- 00:08:40.870 [2024-11-20 15:10:19.542876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:9c002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.870 [2024-11-20 15:10:19.542905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.870 [2024-11-20 15:10:19.542993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:9c0c009c cdw11:9c009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.870 [2024-11-20 15:10:19.543008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.128 #25 NEW cov: 12347 ft: 13371 corp: 4/44b lim: 35 exec/s: 0 rss: 74Mb L: 18/18 MS: 1 ChangeByte- 00:08:41.128 [2024-11-20 15:10:19.613209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:9c002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.128 [2024-11-20 15:10:19.613236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.128 [2024-11-20 15:10:19.613327] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:9c9c009c cdw11:9c009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.128 [2024-11-20 15:10:19.613343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.128 #26 NEW cov: 12432 ft: 13644 corp: 5/62b lim: 35 exec/s: 0 rss: 74Mb L: 18/18 MS: 1 CopyPart- 00:08:41.128 [2024-11-20 15:10:19.662991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:20002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.128 [2024-11-20 15:10:19.663019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.128 #31 NEW cov: 12432 ft: 13844 corp: 6/74b lim: 35 exec/s: 0 rss: 74Mb L: 12/18 MS: 5 CrossOver-CrossOver-ShuffleBytes-ShuffleBytes-CopyPart- 00:08:41.128 [2024-11-20 15:10:19.713613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:9c002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.128 [2024-11-20 15:10:19.713640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.128 [2024-11-20 15:10:19.713723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:989c009c cdw11:9c009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.128 [2024-11-20 15:10:19.713737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.128 #32 NEW cov: 12432 ft: 13890 corp: 7/92b lim: 35 exec/s: 0 rss: 74Mb L: 18/18 MS: 1 ChangeBit- 00:08:41.128 [2024-11-20 15:10:19.763813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:9c002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.128 [2024-11-20 15:10:19.763841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.128 [2024-11-20 15:10:19.763926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:9c9c009c cdw11:9c009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.128 [2024-11-20 15:10:19.763941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.128 #33 NEW cov: 12432 ft: 13964 corp: 8/110b lim: 35 exec/s: 0 rss: 74Mb L: 18/18 MS: 1 ShuffleBytes- 00:08:41.387 [2024-11-20 15:10:19.834828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:9c002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.387 [2024-11-20 15:10:19.834856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.387 [2024-11-20 15:10:19.834944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:9c9c009c cdw11:9c009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.387 [2024-11-20 15:10:19.834960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.387 [2024-11-20 15:10:19.835047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:9c9c0020 cdw11:9c009c9c SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:08:41.387 [2024-11-20 15:10:19.835061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.387 [2024-11-20 15:10:19.835150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:9c9c009c cdw11:20009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.387 [2024-11-20 15:10:19.835165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.387 #34 NEW cov: 12432 ft: 14539 corp: 9/142b lim: 35 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:08:41.387 [2024-11-20 15:10:19.885074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:9c002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.387 [2024-11-20 15:10:19.885103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.387 [2024-11-20 15:10:19.885194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:9c0c009c cdw11:9c009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.387 [2024-11-20 15:10:19.885212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.387 [2024-11-20 15:10:19.885305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:9c20009c cdw11:20002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.387 [2024-11-20 15:10:19.885332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.387 [2024-11-20 15:10:19.885418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:9c9c009c cdw11:9c009c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.387 [2024-11-20 15:10:19.885433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.387 #35 NEW cov: 12432 ft: 14564 corp: 10/176b lim: 35 exec/s: 0 rss: 74Mb L: 34/34 MS: 1 CopyPart- 00:08:41.387 [2024-11-20 15:10:19.954303] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:41.387 [2024-11-20 15:10:19.954775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:2020009a cdw11:00002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.387 [2024-11-20 15:10:19.954805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.387 [2024-11-20 15:10:19.954899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:48000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.387 [2024-11-20 15:10:19.954917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.387 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:41.387 #39 NEW cov: 12466 ft: 14658 corp: 11/190b lim: 35 exec/s: 0 rss: 74Mb L: 14/34 MS: 4 CrossOver-CopyPart-InsertByte-CMP- DE: "\000\000\000\000\000\000\000H"- 00:08:41.387 [2024-11-20 15:10:20.025035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 
cdw10:ffff000e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.387 [2024-11-20 15:10:20.025070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.387 [2024-11-20 15:10:20.025180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.387 [2024-11-20 15:10:20.025197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.387 #44 NEW cov: 12466 ft: 14761 corp: 12/204b lim: 35 exec/s: 0 rss: 74Mb L: 14/34 MS: 5 ChangeBit-CopyPart-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:41.646 [2024-11-20 15:10:20.085253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:9c002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-11-20 15:10:20.085288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.646 [2024-11-20 15:10:20.085378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:9c0c009c cdw11:9c00239c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-11-20 15:10:20.085394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.646 #45 NEW cov: 12466 ft: 14787 corp: 13/223b lim: 35 exec/s: 45 rss: 74Mb L: 19/34 MS: 1 InsertByte- 00:08:41.646 [2024-11-20 15:10:20.145322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:2000001f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-11-20 15:10:20.145356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.646 #46 NEW cov: 12466 ft: 14897 corp: 14/230b lim: 35 exec/s: 46 rss: 74Mb L: 7/34 MS: 1 CMP- DE: "\000\037"- 00:08:41.646 [2024-11-20 15:10:20.195906] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:41.646 [2024-11-20 15:10:20.196643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:9c002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-11-20 15:10:20.196674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.646 [2024-11-20 15:10:20.196762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:2020009c cdw11:00002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-11-20 15:10:20.196778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.646 [2024-11-20 15:10:20.196870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:48000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-11-20 15:10:20.196888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.646 [2024-11-20 15:10:20.196970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:9c9c009c cdw11:9c009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:41.646 [2024-11-20 15:10:20.196985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.646 #47 NEW cov: 12466 ft: 14944 corp: 15/261b lim: 35 exec/s: 47 rss: 74Mb L: 31/34 MS: 1 CrossOver- 00:08:41.646 [2024-11-20 15:10:20.266487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-11-20 15:10:20.266516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.646 [2024-11-20 15:10:20.266604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-11-20 15:10:20.266620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.646 #48 NEW cov: 12466 ft: 14975 corp: 16/275b lim: 35 exec/s: 48 rss: 74Mb L: 14/34 MS: 1 ChangeBit- 00:08:41.904 [2024-11-20 15:10:20.336948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:9c002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.904 [2024-11-20 15:10:20.336977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.904 [2024-11-20 15:10:20.337080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:9c9c009c cdw11:9a009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.904 [2024-11-20 15:10:20.337097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.904 #49 NEW cov: 12466 ft: 14989 corp: 17/294b lim: 35 exec/s: 49 rss: 74Mb L: 19/34 MS: 1 InsertByte- 00:08:41.904 [2024-11-20 15:10:20.388338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:9c002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.904 [2024-11-20 15:10:20.388368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.904 [2024-11-20 15:10:20.388455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:9c0c009c cdw11:9c009c0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.904 [2024-11-20 15:10:20.388474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.904 [2024-11-20 15:10:20.388562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:9c9c009c cdw11:20002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.904 [2024-11-20 15:10:20.388585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.904 [2024-11-20 15:10:20.388686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:9c9c0020 cdw11:0c009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.904 [2024-11-20 15:10:20.388703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.904 [2024-11-20 15:10:20.388793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 
cid:8 nsid:0 cdw10:9c9c009c cdw11:20009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.904 [2024-11-20 15:10:20.388809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:41.904 #50 NEW cov: 12466 ft: 15062 corp: 18/329b lim: 35 exec/s: 50 rss: 74Mb L: 35/35 MS: 1 CrossOver- 00:08:41.904 [2024-11-20 15:10:20.467803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:9c002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.904 [2024-11-20 15:10:20.467831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.904 [2024-11-20 15:10:20.467921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:9c9c003d cdw11:9c009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.904 [2024-11-20 15:10:20.467937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.904 #51 NEW cov: 12466 ft: 15090 corp: 19/349b lim: 35 exec/s: 51 rss: 74Mb L: 20/35 MS: 1 InsertByte- 00:08:41.904 [2024-11-20 15:10:20.539209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:9c002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.904 [2024-11-20 15:10:20.539241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.904 [2024-11-20 15:10:20.539345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c9c009c cdw11:9c009c0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.904 [2024-11-20 15:10:20.539367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.904 [2024-11-20 15:10:20.539490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:9c9c009c cdw11:20002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.904 [2024-11-20 15:10:20.539508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.904 [2024-11-20 15:10:20.539600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:9c9c0020 cdw11:0c009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.904 [2024-11-20 15:10:20.539616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.904 [2024-11-20 15:10:20.539702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:9c9c009c cdw11:20009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.904 [2024-11-20 15:10:20.539720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:41.904 #52 NEW cov: 12466 ft: 15140 corp: 20/384b lim: 35 exec/s: 52 rss: 75Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:42.163 [2024-11-20 15:10:20.619130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:9c9c0020 cdw11:9c009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.163 [2024-11-20 15:10:20.619158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.163 [2024-11-20 
15:10:20.619280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:9c9c009c cdw11:20002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.163 [2024-11-20 15:10:20.619296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:42.163 [2024-11-20 15:10:20.619393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:9c9c0020 cdw11:9c009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.163 [2024-11-20 15:10:20.619408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:42.163 [2024-11-20 15:10:20.619505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:9c9c009c cdw11:20009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.163 [2024-11-20 15:10:20.619528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:42.163 #53 NEW cov: 12466 ft: 15193 corp: 21/412b lim: 35 exec/s: 53 rss: 75Mb L: 28/35 MS: 1 CopyPart- 00:08:42.163 [2024-11-20 15:10:20.669513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:9c002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.163 [2024-11-20 15:10:20.669541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.163 [2024-11-20 15:10:20.669633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:9c9a009c cdw11:9c009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.163 [2024-11-20 15:10:20.669648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:42.163 [2024-11-20 15:10:20.669740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0a9c0020 cdw11:9c009c3d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.163 [2024-11-20 15:10:20.669756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:42.163 [2024-11-20 15:10:20.669847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:9c9c009c cdw11:9c009a9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.163 [2024-11-20 15:10:20.669866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:42.163 #54 NEW cov: 12466 ft: 15200 corp: 22/443b lim: 35 exec/s: 54 rss: 75Mb L: 31/35 MS: 1 CopyPart- 00:08:42.163 [2024-11-20 15:10:20.739074] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:42.163 [2024-11-20 15:10:20.739578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:9c002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.163 [2024-11-20 15:10:20.739608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.163 [2024-11-20 15:10:20.739707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:2020009c cdw11:00002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.163 [2024-11-20 15:10:20.739723] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:42.163 [2024-11-20 15:10:20.739813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:48000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.163 [2024-11-20 15:10:20.739829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:42.163 #55 NEW cov: 12466 ft: 15379 corp: 23/468b lim: 35 exec/s: 55 rss: 75Mb L: 25/35 MS: 1 CrossOver- 00:08:42.163 [2024-11-20 15:10:20.790471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:9c002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.163 [2024-11-20 15:10:20.790498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.163 [2024-11-20 15:10:20.790607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c9c009c cdw11:9c009c0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.163 [2024-11-20 15:10:20.790622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:42.163 [2024-11-20 15:10:20.790729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:9c9c009c cdw11:20002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.163 [2024-11-20 15:10:20.790744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:42.163 [2024-11-20 15:10:20.790832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:9c9c0020 cdw11:0c009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.163 [2024-11-20 15:10:20.790848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:42.163 [2024-11-20 15:10:20.790932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:9c9c009c cdw11:7e009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.163 [2024-11-20 15:10:20.790946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:42.163 #56 NEW cov: 12466 ft: 15387 corp: 24/503b lim: 35 exec/s: 56 rss: 75Mb L: 35/35 MS: 1 ChangeByte- 00:08:42.422 [2024-11-20 15:10:20.860170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.422 [2024-11-20 15:10:20.860197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.422 [2024-11-20 15:10:20.860279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.422 [2024-11-20 15:10:20.860296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:42.422 #57 NEW cov: 12466 ft: 15409 corp: 25/517b lim: 35 exec/s: 57 rss: 75Mb L: 14/35 MS: 1 ShuffleBytes- 00:08:42.422 [2024-11-20 15:10:20.930913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:9c002020 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:42.422 [2024-11-20 15:10:20.930940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.422 [2024-11-20 15:10:20.931049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:9c9c009c cdw11:9c009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.422 [2024-11-20 15:10:20.931066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:42.422 [2024-11-20 15:10:20.931164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:9c20009c cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.422 [2024-11-20 15:10:20.931179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:42.422 #58 NEW cov: 12466 ft: 15421 corp: 26/543b lim: 35 exec/s: 58 rss: 75Mb L: 26/35 MS: 1 InsertRepeatedBytes- 00:08:42.422 [2024-11-20 15:10:20.980807] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:42.422 [2024-11-20 15:10:20.981516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:9c9c0020 cdw11:9c009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.422 [2024-11-20 15:10:20.981544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.422 [2024-11-20 15:10:20.981634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:9c9c009c cdw11:00002000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.422 [2024-11-20 15:10:20.981650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:42.422 [2024-11-20 15:10:20.981745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:9c000048 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.422 [2024-11-20 15:10:20.981762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:42.422 [2024-11-20 15:10:20.981848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:9c9c009c cdw11:20009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.422 [2024-11-20 15:10:20.981863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:42.422 #59 NEW cov: 12466 ft: 15460 corp: 27/571b lim: 35 exec/s: 59 rss: 75Mb L: 28/35 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000H"- 00:08:42.422 [2024-11-20 15:10:21.052022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:9c002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.422 [2024-11-20 15:10:21.052051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.422 [2024-11-20 15:10:21.052159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:9c9c009c cdw11:9c009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.422 [2024-11-20 15:10:21.052175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:42.422 [2024-11-20 
15:10:21.052267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:9c9c0020 cdw11:1f009c00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.422 [2024-11-20 15:10:21.052283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:42.422 [2024-11-20 15:10:21.052378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:9c9c009c cdw11:9c009c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.422 [2024-11-20 15:10:21.052397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:42.422 #60 NEW cov: 12466 ft: 15466 corp: 28/605b lim: 35 exec/s: 60 rss: 75Mb L: 34/35 MS: 1 PersAutoDict- DE: "\000\037"- 00:08:42.681 [2024-11-20 15:10:21.121354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:20002020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.681 [2024-11-20 15:10:21.121383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.681 #61 NEW cov: 12466 ft: 15473 corp: 29/612b lim: 35 exec/s: 30 rss: 75Mb L: 7/35 MS: 1 ShuffleBytes- 00:08:42.681 #61 DONE cov: 12466 ft: 15473 corp: 29/612b lim: 35 exec/s: 30 rss: 75Mb 00:08:42.681 ###### Recommended dictionary. ###### 00:08:42.681 "\000\000\000\000\000\000\000H" # Uses: 1 00:08:42.681 "\000\037" # Uses: 1 00:08:42.681 ###### End of recommended dictionary. ###### 00:08:42.681 Done 61 runs in 2 second(s) 00:08:42.681 15:10:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_2.conf /var/tmp/suppress_nvmf_fuzz 00:08:42.681 15:10:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:42.681 15:10:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:42.681 15:10:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:42.681 15:10:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:08:42.681 15:10:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:42.681 15:10:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:42.681 15:10:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:42.681 15:10:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:08:42.681 15:10:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:42.681 15:10:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:42.681 15:10:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 3 00:08:42.681 15:10:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4403 00:08:42.681 15:10:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:42.681 15:10:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:08:42.681 15:10:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:42.681 
15:10:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:42.681 15:10:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:42.681 15:10:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 00:08:42.681 [2024-11-20 15:10:21.279436] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:08:42.681 [2024-11-20 15:10:21.279505] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1474432 ] 00:08:42.940 [2024-11-20 15:10:21.477373] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.940 [2024-11-20 15:10:21.492322] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:42.940 [2024-11-20 15:10:21.545018] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:42.940 [2024-11-20 15:10:21.561252] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:08:42.940 INFO: Running with entropic power schedule (0xFF, 100). 00:08:42.940 INFO: Seed: 1281986935 00:08:42.940 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:08:42.940 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:08:42.940 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:42.940 INFO: A corpus is not provided, starting from an empty corpus 00:08:42.940 #2 INITED exec/s: 0 rss: 66Mb 00:08:42.940 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:42.940 This may also happen if the target rejected all inputs we tried so far 00:08:43.456 NEW_FUNC[1/704]: 0x45e6d8 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:08:43.456 NEW_FUNC[2/704]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:43.456 #3 NEW cov: 12136 ft: 12135 corp: 2/12b lim: 20 exec/s: 0 rss: 73Mb L: 11/11 MS: 1 InsertRepeatedBytes- 00:08:43.456 #6 NEW cov: 12253 ft: 13065 corp: 3/24b lim: 20 exec/s: 0 rss: 74Mb L: 12/12 MS: 3 ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:43.456 #7 NEW cov: 12259 ft: 13288 corp: 4/38b lim: 20 exec/s: 0 rss: 74Mb L: 14/14 MS: 1 InsertRepeatedBytes- 00:08:43.456 #8 NEW cov: 12344 ft: 13574 corp: 5/49b lim: 20 exec/s: 0 rss: 74Mb L: 11/14 MS: 1 ChangeBinInt- 00:08:43.456 #9 NEW cov: 12344 ft: 13666 corp: 6/63b lim: 20 exec/s: 0 rss: 74Mb L: 14/14 MS: 1 ChangeBinInt- 00:08:43.714 #10 NEW cov: 12360 ft: 13936 corp: 7/83b lim: 20 exec/s: 0 rss: 74Mb L: 20/20 MS: 1 CrossOver- 00:08:43.714 #11 NEW cov: 12360 ft: 14053 corp: 8/97b lim: 20 exec/s: 0 rss: 74Mb L: 14/20 MS: 1 CopyPart- 00:08:43.714 #12 NEW cov: 12360 ft: 14110 corp: 9/109b lim: 20 exec/s: 0 rss: 74Mb L: 12/20 MS: 1 InsertByte- 00:08:43.714 #13 NEW cov: 12360 ft: 14193 corp: 10/120b lim: 20 exec/s: 0 rss: 74Mb L: 11/20 MS: 1 ChangeBinInt- 00:08:43.972 #14 NEW cov: 12360 ft: 14255 corp: 11/134b lim: 20 exec/s: 0 rss: 74Mb L: 14/20 MS: 1 ChangeBit- 00:08:43.972 #15 NEW cov: 12360 ft: 14296 corp: 12/145b lim: 20 exec/s: 0 rss: 74Mb L: 11/20 MS: 1 ChangeBinInt- 00:08:43.972 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:43.972 #16 NEW cov: 12383 ft: 14345 corp: 13/156b lim: 20 exec/s: 0 rss: 74Mb L: 11/20 MS: 1 EraseBytes- 00:08:43.972 #17 NEW cov: 12383 ft: 14368 corp: 14/167b lim: 20 exec/s: 0 rss: 74Mb L: 11/20 MS: 1 ShuffleBytes- 00:08:43.972 #18 NEW cov: 12383 ft: 14411 corp: 15/182b lim: 20 exec/s: 18 rss: 74Mb L: 15/20 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:43.972 #19 NEW cov: 12383 ft: 14422 corp: 16/196b lim: 20 exec/s: 19 rss: 74Mb L: 14/20 MS: 1 CopyPart- 00:08:44.230 #20 NEW cov: 12383 ft: 14441 corp: 17/210b lim: 20 exec/s: 20 rss: 74Mb L: 14/20 MS: 1 ChangeBinInt- 00:08:44.230 #21 NEW cov: 12383 ft: 14444 corp: 18/224b lim: 20 exec/s: 21 rss: 74Mb L: 14/20 MS: 1 ShuffleBytes- 00:08:44.230 #22 NEW cov: 12383 ft: 14462 corp: 19/235b lim: 20 exec/s: 22 rss: 74Mb L: 11/20 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:44.230 #23 NEW cov: 12383 ft: 14501 corp: 20/255b lim: 20 exec/s: 23 rss: 74Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:08:44.230 #24 NEW cov: 12383 ft: 14765 corp: 21/261b lim: 20 exec/s: 24 rss: 75Mb L: 6/20 MS: 1 EraseBytes- 00:08:44.487 #25 NEW cov: 12383 ft: 14789 corp: 22/273b lim: 20 exec/s: 25 rss: 75Mb L: 12/20 MS: 1 ChangeBinInt- 00:08:44.487 #26 NEW cov: 12383 ft: 14813 corp: 23/293b lim: 20 exec/s: 26 rss: 75Mb L: 20/20 MS: 1 ChangeByte- 00:08:44.487 #27 NEW cov: 12383 ft: 14828 corp: 24/308b lim: 20 exec/s: 27 rss: 75Mb L: 15/20 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:44.487 #28 NEW cov: 12384 ft: 14852 corp: 25/325b lim: 20 exec/s: 28 rss: 75Mb L: 17/20 MS: 1 CopyPart- 00:08:44.487 #29 NEW cov: 12384 ft: 14856 corp: 26/343b lim: 20 exec/s: 29 rss: 75Mb L: 18/20 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:44.745 #30 NEW cov: 12384 ft: 14868 corp: 27/357b lim: 20 exec/s: 30 rss: 75Mb 
L: 14/20 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:44.745 #31 NEW cov: 12384 ft: 14880 corp: 28/371b lim: 20 exec/s: 31 rss: 75Mb L: 14/20 MS: 1 ChangeBit- 00:08:44.745 #32 NEW cov: 12384 ft: 14891 corp: 29/377b lim: 20 exec/s: 32 rss: 75Mb L: 6/20 MS: 1 ChangeByte- 00:08:44.745 #33 NEW cov: 12384 ft: 14907 corp: 30/391b lim: 20 exec/s: 33 rss: 75Mb L: 14/20 MS: 1 CopyPart- 00:08:44.745 #34 NEW cov: 12384 ft: 14930 corp: 31/409b lim: 20 exec/s: 34 rss: 75Mb L: 18/20 MS: 1 ShuffleBytes- 00:08:45.004 #35 NEW cov: 12384 ft: 14940 corp: 32/423b lim: 20 exec/s: 35 rss: 75Mb L: 14/20 MS: 1 ChangeBinInt- 00:08:45.004 #36 NEW cov: 12384 ft: 14986 corp: 33/442b lim: 20 exec/s: 36 rss: 75Mb L: 19/20 MS: 1 CrossOver- 00:08:45.004 #37 NEW cov: 12384 ft: 15038 corp: 34/453b lim: 20 exec/s: 37 rss: 75Mb L: 11/20 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:45.004 #38 NEW cov: 12384 ft: 15046 corp: 35/464b lim: 20 exec/s: 19 rss: 75Mb L: 11/20 MS: 1 ChangeBinInt- 00:08:45.004 #38 DONE cov: 12384 ft: 15046 corp: 35/464b lim: 20 exec/s: 19 rss: 75Mb 00:08:45.004 ###### Recommended dictionary. ###### 00:08:45.004 "\377\377\377\377" # Uses: 5 00:08:45.004 ###### End of recommended dictionary. ###### 00:08:45.004 Done 38 runs in 2 second(s) 00:08:45.262 15:10:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_3.conf /var/tmp/suppress_nvmf_fuzz 00:08:45.262 15:10:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:45.262 15:10:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:45.262 15:10:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:45.262 15:10:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:08:45.262 15:10:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:45.262 15:10:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:45.262 15:10:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:45.262 15:10:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:08:45.262 15:10:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:45.262 15:10:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:45.262 15:10:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 4 00:08:45.262 15:10:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4404 00:08:45.262 15:10:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:45.262 15:10:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:08:45.262 15:10:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:45.262 15:10:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:45.262 15:10:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:45.262 15:10:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 00:08:45.262 [2024-11-20 15:10:23.751630] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:08:45.262 [2024-11-20 15:10:23.751698] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1474783 ] 00:08:45.521 [2024-11-20 15:10:23.952241] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:45.521 [2024-11-20 15:10:23.966877] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.521 [2024-11-20 15:10:24.019592] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:45.521 [2024-11-20 15:10:24.035833] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:08:45.521 INFO: Running with entropic power schedule (0xFF, 100). 00:08:45.521 INFO: Seed: 3756965369 00:08:45.521 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:08:45.521 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:08:45.521 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:45.521 INFO: A corpus is not provided, starting from an empty corpus 00:08:45.521 #2 INITED exec/s: 0 rss: 66Mb 00:08:45.521 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:45.521 This may also happen if the target rejected all inputs we tried so far 00:08:45.521 [2024-11-20 15:10:24.091253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.521 [2024-11-20 15:10:24.091284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:45.779 NEW_FUNC[1/716]: 0x45f7d8 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:08:45.779 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:45.779 #18 NEW cov: 12249 ft: 12244 corp: 2/10b lim: 35 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:45.779 [2024-11-20 15:10:24.432170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b7300a5d cdw11:ee950000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.779 [2024-11-20 15:10:24.432209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:45.779 #19 NEW cov: 12362 ft: 12854 corp: 3/19b lim: 35 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 CMP- DE: "]\2670\356\225$E\000"- 00:08:46.037 [2024-11-20 15:10:24.472535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.037 [2024-11-20 15:10:24.472563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:46.037 [2024-11-20 15:10:24.472620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.037 [2024-11-20 15:10:24.472635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:46.037 [2024-11-20 15:10:24.472688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ff00ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.037 [2024-11-20 15:10:24.472704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:46.037 #20 NEW cov: 12368 ft: 13831 corp: 4/41b lim: 35 exec/s: 0 rss: 73Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:08:46.037 [2024-11-20 15:10:24.532370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.037 [2024-11-20 15:10:24.532398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:46.037 #21 NEW cov: 12453 ft: 14246 corp: 5/50b lim: 35 exec/s: 0 rss: 73Mb L: 9/22 MS: 1 ShuffleBytes- 00:08:46.037 [2024-11-20 15:10:24.572787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00ff0a00 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.037 [2024-11-20 15:10:24.572813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:46.037 [2024-11-20 15:10:24.572868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.037 [2024-11-20 15:10:24.572883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:46.037 [2024-11-20 15:10:24.572941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ff00ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.037 [2024-11-20 15:10:24.572958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:46.037 #22 NEW cov: 12453 ft: 14343 corp: 6/72b lim: 35 exec/s: 0 rss: 73Mb L: 22/22 MS: 1 ShuffleBytes- 00:08:46.037 [2024-11-20 15:10:24.632785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000260a cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.037 [2024-11-20 15:10:24.632813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:46.037 [2024-11-20 15:10:24.632869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.037 [2024-11-20 15:10:24.632884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:46.037 #29 NEW cov: 12453 ft: 14649 corp: 7/91b lim: 35 exec/s: 0 rss: 74Mb L: 19/22 MS: 2 ChangeByte-CrossOver- 00:08:46.037 [2024-11-20 15:10:24.672724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO 
CQ (05) qid:0 cid:4 nsid:0 cdw10:0000260a cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.037 [2024-11-20 15:10:24.672751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:46.037 #30 NEW cov: 12453 ft: 14743 corp: 8/104b lim: 35 exec/s: 0 rss: 74Mb L: 13/22 MS: 1 EraseBytes- 00:08:46.295 [2024-11-20 15:10:24.732895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:40000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.295 [2024-11-20 15:10:24.732921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:46.295 #31 NEW cov: 12453 ft: 14813 corp: 9/113b lim: 35 exec/s: 0 rss: 74Mb L: 9/22 MS: 1 ChangeBit- 00:08:46.296 [2024-11-20 15:10:24.793214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b7300a5d cdw11:ee000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.296 [2024-11-20 15:10:24.793240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:46.296 [2024-11-20 15:10:24.793296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.296 [2024-11-20 15:10:24.793311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:46.296 #32 NEW cov: 12453 ft: 14857 corp: 10/130b lim: 35 exec/s: 0 rss: 74Mb L: 17/22 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:46.296 [2024-11-20 15:10:24.853549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00ff0a00 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.296 [2024-11-20 15:10:24.853577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:46.296 [2024-11-20 15:10:24.853628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.296 [2024-11-20 15:10:24.853643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:46.296 [2024-11-20 15:10:24.853699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ff00ffff cdw11:fa000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.296 [2024-11-20 15:10:24.853713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:46.296 #33 NEW cov: 12453 ft: 14891 corp: 11/153b lim: 35 exec/s: 0 rss: 74Mb L: 23/23 MS: 1 InsertByte- 00:08:46.296 [2024-11-20 15:10:24.913624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000260a cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.296 [2024-11-20 15:10:24.913653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:46.296 [2024-11-20 15:10:24.913710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.296 
[2024-11-20 15:10:24.913724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:46.296 #34 NEW cov: 12453 ft: 14916 corp: 12/172b lim: 35 exec/s: 0 rss: 74Mb L: 19/23 MS: 1 CrossOver- 00:08:46.296 [2024-11-20 15:10:24.973565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:01000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.296 [2024-11-20 15:10:24.973590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:46.554 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:46.554 #35 NEW cov: 12476 ft: 14974 corp: 13/181b lim: 35 exec/s: 0 rss: 74Mb L: 9/23 MS: 1 ChangeBinInt- 00:08:46.554 [2024-11-20 15:10:25.014048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00ff0a00 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.554 [2024-11-20 15:10:25.014074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:46.554 [2024-11-20 15:10:25.014130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ff01ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.554 [2024-11-20 15:10:25.014145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:46.554 [2024-11-20 15:10:25.014197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:fa000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.554 [2024-11-20 15:10:25.014211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:46.554 #36 NEW cov: 12476 ft: 14982 corp: 14/204b lim: 35 exec/s: 0 rss: 74Mb L: 23/23 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:46.554 [2024-11-20 15:10:25.073876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b730ad5d cdw11:ee950000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.554 [2024-11-20 15:10:25.073901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:46.554 #41 NEW cov: 12476 ft: 15013 corp: 15/217b lim: 35 exec/s: 41 rss: 75Mb L: 13/23 MS: 5 InsertByte-ChangeBit-CopyPart-InsertByte-PersAutoDict- DE: "]\2670\356\225$E\000"- 00:08:46.554 [2024-11-20 15:10:25.113975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b7300a5d cdw11:ce950000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.554 [2024-11-20 15:10:25.114000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:46.554 #42 NEW cov: 12476 ft: 15026 corp: 16/226b lim: 35 exec/s: 42 rss: 75Mb L: 9/23 MS: 1 ChangeBit- 00:08:46.554 [2024-11-20 15:10:25.154261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00ff0a00 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.554 [2024-11-20 15:10:25.154286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:46.554 [2024-11-20 
15:10:25.154343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:00fa0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.554 [2024-11-20 15:10:25.154359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:46.554 #43 NEW cov: 12476 ft: 15062 corp: 17/243b lim: 35 exec/s: 43 rss: 75Mb L: 17/23 MS: 1 EraseBytes- 00:08:46.554 [2024-11-20 15:10:25.194204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b7300a5d cdw11:ce950000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.554 [2024-11-20 15:10:25.194231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:46.554 #44 NEW cov: 12476 ft: 15093 corp: 18/252b lim: 35 exec/s: 44 rss: 75Mb L: 9/23 MS: 1 ChangeByte- 00:08:46.813 [2024-11-20 15:10:25.254546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000260a cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.813 [2024-11-20 15:10:25.254572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:46.813 [2024-11-20 15:10:25.254631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff36ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.813 [2024-11-20 15:10:25.254646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:46.813 #45 NEW cov: 12476 ft: 15156 corp: 19/272b lim: 35 exec/s: 45 rss: 75Mb L: 20/23 MS: 1 InsertByte- 00:08:46.813 [2024-11-20 15:10:25.314747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000260a cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.813 [2024-11-20 15:10:25.314774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:46.813 [2024-11-20 15:10:25.314829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff36ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.813 [2024-11-20 15:10:25.314844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:46.813 #46 NEW cov: 12476 ft: 15157 corp: 20/290b lim: 35 exec/s: 46 rss: 75Mb L: 18/23 MS: 1 EraseBytes- 00:08:46.813 [2024-11-20 15:10:25.374912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000260a cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.813 [2024-11-20 15:10:25.374938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:46.813 [2024-11-20 15:10:25.374993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:fffffffe cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.813 [2024-11-20 15:10:25.375008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:46.813 #47 NEW cov: 12476 ft: 15186 corp: 21/309b lim: 35 exec/s: 47 rss: 75Mb L: 19/23 MS: 1 ChangeBit- 00:08:46.813 [2024-11-20 15:10:25.415157] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002605 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.813 [2024-11-20 15:10:25.415184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:46.813 [2024-11-20 15:10:25.415241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0a000000 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.813 [2024-11-20 15:10:25.415256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:46.813 [2024-11-20 15:10:25.415312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffff36 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.813 [2024-11-20 15:10:25.415332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:46.813 #48 NEW cov: 12476 ft: 15204 corp: 22/335b lim: 35 exec/s: 48 rss: 75Mb L: 26/26 MS: 1 CMP- DE: "\005\000\000\000\000\000\000\000"- 00:08:46.813 [2024-11-20 15:10:25.475011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b7b70a5d cdw11:30ce0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.813 [2024-11-20 15:10:25.475039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.072 #49 NEW cov: 12476 ft: 15225 corp: 23/344b lim: 35 exec/s: 49 rss: 75Mb L: 9/26 MS: 1 CopyPart- 00:08:47.072 [2024-11-20 15:10:25.535300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.072 [2024-11-20 15:10:25.535331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.072 #50 NEW cov: 12476 ft: 15233 corp: 24/353b lim: 35 exec/s: 50 rss: 75Mb L: 9/26 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:47.072 [2024-11-20 15:10:25.595782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:27000a00 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.072 [2024-11-20 15:10:25.595809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.072 [2024-11-20 15:10:25.595864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.072 [2024-11-20 15:10:25.595879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.072 [2024-11-20 15:10:25.595935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:00fa0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.072 [2024-11-20 15:10:25.595949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.072 #51 NEW cov: 12476 ft: 15240 corp: 25/377b lim: 35 exec/s: 51 rss: 75Mb L: 24/26 MS: 1 InsertByte- 00:08:47.072 [2024-11-20 15:10:25.636014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 
nsid:0 cdw10:27000a00 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.072 [2024-11-20 15:10:25.636040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.072 [2024-11-20 15:10:25.636096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.072 [2024-11-20 15:10:25.636111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.072 [2024-11-20 15:10:25.636165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffff00 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.072 [2024-11-20 15:10:25.636180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.072 [2024-11-20 15:10:25.636236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.072 [2024-11-20 15:10:25.636250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:47.072 #52 NEW cov: 12476 ft: 15577 corp: 26/409b lim: 35 exec/s: 52 rss: 75Mb L: 32/32 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:47.072 [2024-11-20 15:10:25.696082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a3b cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.072 [2024-11-20 15:10:25.696109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.072 [2024-11-20 15:10:25.696165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.072 [2024-11-20 15:10:25.696181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.072 [2024-11-20 15:10:25.696240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.072 [2024-11-20 15:10:25.696256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.072 #53 NEW cov: 12476 ft: 15596 corp: 27/432b lim: 35 exec/s: 53 rss: 75Mb L: 23/32 MS: 1 InsertByte- 00:08:47.072 [2024-11-20 15:10:25.736344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:27000a00 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.072 [2024-11-20 15:10:25.736371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.072 [2024-11-20 15:10:25.736429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.072 [2024-11-20 15:10:25.736444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.072 [2024-11-20 15:10:25.736497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.072 [2024-11-20 15:10:25.736511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.072 [2024-11-20 15:10:25.736563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:00fa0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.072 [2024-11-20 15:10:25.736578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:47.330 #54 NEW cov: 12476 ft: 15601 corp: 28/464b lim: 35 exec/s: 54 rss: 75Mb L: 32/32 MS: 1 CrossOver- 00:08:47.330 [2024-11-20 15:10:25.796042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b7300a5d cdw11:ee950000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.330 [2024-11-20 15:10:25.796069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.330 #55 NEW cov: 12476 ft: 15622 corp: 29/473b lim: 35 exec/s: 55 rss: 75Mb L: 9/32 MS: 1 ChangeBit- 00:08:47.330 [2024-11-20 15:10:25.836297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a5d cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.330 [2024-11-20 15:10:25.836329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.330 [2024-11-20 15:10:25.836385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffb7ffff cdw11:30ee0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.330 [2024-11-20 15:10:25.836400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.330 #56 NEW cov: 12476 ft: 15637 corp: 30/490b lim: 35 exec/s: 56 rss: 75Mb L: 17/32 MS: 1 CrossOver- 00:08:47.330 [2024-11-20 15:10:25.896305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b7300a5d cdw11:ee950000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.330 [2024-11-20 15:10:25.896339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.330 #57 NEW cov: 12476 ft: 15659 corp: 31/498b lim: 35 exec/s: 57 rss: 75Mb L: 8/32 MS: 1 EraseBytes- 00:08:47.330 [2024-11-20 15:10:25.936565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5db7ad31 cdw11:30ee0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.330 [2024-11-20 15:10:25.936591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.330 [2024-11-20 15:10:25.936649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:000a2445 cdw11:ad7e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.330 [2024-11-20 15:10:25.936667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.330 #58 NEW cov: 12476 ft: 15664 corp: 32/512b lim: 35 exec/s: 58 rss: 75Mb L: 14/32 MS: 1 InsertByte- 00:08:47.330 [2024-11-20 15:10:25.996936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 
nsid:0 cdw10:0000260a cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.330 [2024-11-20 15:10:25.996962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.330 [2024-11-20 15:10:25.997020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0a0036ff cdw11:27000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.330 [2024-11-20 15:10:25.997035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.330 [2024-11-20 15:10:25.997090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.330 [2024-11-20 15:10:25.997106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.589 #59 NEW cov: 12476 ft: 15667 corp: 33/534b lim: 35 exec/s: 59 rss: 75Mb L: 22/32 MS: 1 CrossOver- 00:08:47.589 [2024-11-20 15:10:26.036829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a11 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.589 [2024-11-20 15:10:26.036855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.589 [2024-11-20 15:10:26.036911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.589 [2024-11-20 15:10:26.036926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.589 #60 NEW cov: 12476 ft: 15683 corp: 34/551b lim: 35 exec/s: 30 rss: 75Mb L: 17/32 MS: 1 ChangeBinInt- 00:08:47.589 #60 DONE cov: 12476 ft: 15683 corp: 34/551b lim: 35 exec/s: 30 rss: 75Mb 00:08:47.589 ###### Recommended dictionary. ###### 00:08:47.589 "\000\000\000\000\000\000\000\000" # Uses: 1 00:08:47.589 "]\2670\356\225$E\000" # Uses: 1 00:08:47.589 "\001\000\000\000\000\000\000\000" # Uses: 1 00:08:47.589 "\005\000\000\000\000\000\000\000" # Uses: 0 00:08:47.589 "\377\377\377\377\377\377\377\377" # Uses: 0 00:08:47.589 ###### End of recommended dictionary. 
###### 00:08:47.589 Done 60 runs in 2 second(s) 00:08:47.589 15:10:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_4.conf /var/tmp/suppress_nvmf_fuzz 00:08:47.589 15:10:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:47.589 15:10:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:47.589 15:10:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:47.589 15:10:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:08:47.589 15:10:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:47.589 15:10:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:47.589 15:10:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:47.589 15:10:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:08:47.589 15:10:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:47.589 15:10:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:47.589 15:10:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 5 00:08:47.589 15:10:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4405 00:08:47.589 15:10:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:47.589 15:10:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:08:47.589 15:10:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:47.589 15:10:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:47.589 15:10:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:47.589 15:10:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 00:08:47.589 [2024-11-20 15:10:26.222445] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:08:47.589 [2024-11-20 15:10:26.222515] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1475142 ] 00:08:47.848 [2024-11-20 15:10:26.422627] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:47.848 [2024-11-20 15:10:26.437212] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.848 [2024-11-20 15:10:26.489949] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:47.848 [2024-11-20 15:10:26.506183] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:08:47.848 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:47.848 INFO: Seed: 1931993412 00:08:48.106 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:08:48.106 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:08:48.106 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:48.106 INFO: A corpus is not provided, starting from an empty corpus 00:08:48.106 #2 INITED exec/s: 0 rss: 66Mb 00:08:48.106 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:48.106 This may also happen if the target rejected all inputs we tried so far 00:08:48.106 [2024-11-20 15:10:26.561280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c0c040c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.106 [2024-11-20 15:10:26.561323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.106 [2024-11-20 15:10:26.561358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.106 [2024-11-20 15:10:26.561374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.106 [2024-11-20 15:10:26.561403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.106 [2024-11-20 15:10:26.561420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.106 [2024-11-20 15:10:26.561448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.106 [2024-11-20 15:10:26.561464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.365 NEW_FUNC[1/716]: 0x461978 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:08:48.365 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:48.365 #7 NEW cov: 12260 ft: 12259 corp: 2/42b lim: 45 exec/s: 0 rss: 73Mb L: 41/41 MS: 5 CopyPart-ShuffleBytes-InsertByte-ChangeByte-InsertRepeatedBytes- 00:08:48.365 [2024-11-20 15:10:26.935840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c0c040c0 cdw11:c0c00000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.365 [2024-11-20 15:10:26.935885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.365 [2024-11-20 15:10:26.935984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.365 [2024-11-20 15:10:26.936003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.365 [2024-11-20 15:10:26.936098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:48.365 [2024-11-20 15:10:26.936117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.365 [2024-11-20 15:10:26.936216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.365 [2024-11-20 15:10:26.936236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.365 #8 NEW cov: 12373 ft: 12983 corp: 3/84b lim: 45 exec/s: 0 rss: 73Mb L: 42/42 MS: 1 CrossOver- 00:08:48.365 [2024-11-20 15:10:27.014795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:01010101 cdw11:01010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.365 [2024-11-20 15:10:27.014823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.365 #13 NEW cov: 12379 ft: 14015 corp: 4/95b lim: 45 exec/s: 0 rss: 73Mb L: 11/42 MS: 5 InsertByte-InsertByte-EraseBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:48.624 [2024-11-20 15:10:27.076451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c0c040c0 cdw11:3fc00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.624 [2024-11-20 15:10:27.076479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.624 [2024-11-20 15:10:27.076571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.624 [2024-11-20 15:10:27.076586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.624 [2024-11-20 15:10:27.076678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.624 [2024-11-20 15:10:27.076694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.624 [2024-11-20 15:10:27.076786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.624 [2024-11-20 15:10:27.076800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.624 #14 NEW cov: 12464 ft: 14283 corp: 5/136b lim: 45 exec/s: 0 rss: 73Mb L: 41/42 MS: 1 ChangeByte- 00:08:48.624 [2024-11-20 15:10:27.136880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c0c040c0 cdw11:c0c00000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.624 [2024-11-20 15:10:27.136907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.624 [2024-11-20 15:10:27.136999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.624 [2024-11-20 15:10:27.137014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.624 [2024-11-20 
15:10:27.137104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.624 [2024-11-20 15:10:27.137118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.624 [2024-11-20 15:10:27.137208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.624 [2024-11-20 15:10:27.137224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.624 #15 NEW cov: 12464 ft: 14426 corp: 6/178b lim: 45 exec/s: 0 rss: 73Mb L: 42/42 MS: 1 ShuffleBytes- 00:08:48.624 [2024-11-20 15:10:27.206218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:01010101 cdw11:01010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.624 [2024-11-20 15:10:27.206246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.624 #16 NEW cov: 12464 ft: 14550 corp: 7/189b lim: 45 exec/s: 0 rss: 73Mb L: 11/42 MS: 1 CopyPart- 00:08:48.624 [2024-11-20 15:10:27.287932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c0c040c0 cdw11:c0c00000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.624 [2024-11-20 15:10:27.287961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.624 [2024-11-20 15:10:27.288056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.624 [2024-11-20 15:10:27.288072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.624 [2024-11-20 15:10:27.288155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:c8c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.624 [2024-11-20 15:10:27.288170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.624 [2024-11-20 15:10:27.288264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.624 [2024-11-20 15:10:27.288278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.883 #17 NEW cov: 12464 ft: 14591 corp: 8/231b lim: 45 exec/s: 0 rss: 74Mb L: 42/42 MS: 1 ChangeBit- 00:08:48.883 [2024-11-20 15:10:27.338304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c0c040c0 cdw11:c0c00000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.883 [2024-11-20 15:10:27.338337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.883 [2024-11-20 15:10:27.338432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.883 [2024-11-20 15:10:27.338448] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.883 [2024-11-20 15:10:27.338539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:c8c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.883 [2024-11-20 15:10:27.338556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.883 [2024-11-20 15:10:27.338641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.883 [2024-11-20 15:10:27.338657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.883 #18 NEW cov: 12464 ft: 14634 corp: 9/273b lim: 45 exec/s: 0 rss: 74Mb L: 42/42 MS: 1 CrossOver- 00:08:48.883 [2024-11-20 15:10:27.408809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:eaeaeaea cdw11:eaea0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.883 [2024-11-20 15:10:27.408835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.883 [2024-11-20 15:10:27.408924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:eaeaeaea cdw11:eaea0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.883 [2024-11-20 15:10:27.408939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.883 [2024-11-20 15:10:27.409038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:eaeaeaea cdw11:eaea0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.883 [2024-11-20 15:10:27.409053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.883 [2024-11-20 15:10:27.409143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:01010101 cdw11:01010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.883 [2024-11-20 15:10:27.409157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.883 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:48.883 #19 NEW cov: 12481 ft: 14664 corp: 10/311b lim: 45 exec/s: 0 rss: 74Mb L: 38/42 MS: 1 InsertRepeatedBytes- 00:08:48.883 [2024-11-20 15:10:27.479279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c0c040c0 cdw11:3fc00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.883 [2024-11-20 15:10:27.479305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.883 [2024-11-20 15:10:27.479406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.883 [2024-11-20 15:10:27.479421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.883 [2024-11-20 15:10:27.479509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 
cid:6 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.883 [2024-11-20 15:10:27.479524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.883 [2024-11-20 15:10:27.479620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.883 [2024-11-20 15:10:27.479635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.883 #20 NEW cov: 12481 ft: 14706 corp: 11/352b lim: 45 exec/s: 0 rss: 74Mb L: 41/42 MS: 1 ChangeBit- 00:08:48.883 [2024-11-20 15:10:27.549777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c0c040c0 cdw11:3fc00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.883 [2024-11-20 15:10:27.549804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.883 [2024-11-20 15:10:27.549915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.883 [2024-11-20 15:10:27.549931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.883 [2024-11-20 15:10:27.550033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:0101c001 cdw11:01010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.883 [2024-11-20 15:10:27.550051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.883 [2024-11-20 15:10:27.550149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7ac0010a cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.883 [2024-11-20 15:10:27.550165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:49.142 #21 NEW cov: 12481 ft: 14746 corp: 12/393b lim: 45 exec/s: 21 rss: 74Mb L: 41/42 MS: 1 CrossOver- 00:08:49.142 [2024-11-20 15:10:27.600044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c0c0fb40 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.142 [2024-11-20 15:10:27.600070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.142 [2024-11-20 15:10:27.600158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.142 [2024-11-20 15:10:27.600174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.142 [2024-11-20 15:10:27.600268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.142 [2024-11-20 15:10:27.600282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:49.142 [2024-11-20 15:10:27.600409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 
cid:7 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.142 [2024-11-20 15:10:27.600424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:49.142 #22 NEW cov: 12481 ft: 14786 corp: 13/435b lim: 45 exec/s: 22 rss: 74Mb L: 42/42 MS: 1 CopyPart- 00:08:49.142 [2024-11-20 15:10:27.649241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:01010101 cdw11:01010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.142 [2024-11-20 15:10:27.649266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.142 #23 NEW cov: 12481 ft: 14817 corp: 14/446b lim: 45 exec/s: 23 rss: 74Mb L: 11/42 MS: 1 ShuffleBytes- 00:08:49.142 [2024-11-20 15:10:27.700915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c0c040c0 cdw11:3fc00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.142 [2024-11-20 15:10:27.700940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.142 [2024-11-20 15:10:27.701035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.142 [2024-11-20 15:10:27.701050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.142 [2024-11-20 15:10:27.701132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.142 [2024-11-20 15:10:27.701145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:49.142 [2024-11-20 15:10:27.701233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.142 [2024-11-20 15:10:27.701248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:49.142 #24 NEW cov: 12481 ft: 14847 corp: 15/488b lim: 45 exec/s: 24 rss: 74Mb L: 42/42 MS: 1 InsertByte- 00:08:49.142 [2024-11-20 15:10:27.771282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c0c040c0 cdw11:3fc00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.142 [2024-11-20 15:10:27.771311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.142 [2024-11-20 15:10:27.771418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.142 [2024-11-20 15:10:27.771433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.142 [2024-11-20 15:10:27.771513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.142 [2024-11-20 15:10:27.771528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:49.142 
[2024-11-20 15:10:27.771631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.142 [2024-11-20 15:10:27.771646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:49.142 #25 NEW cov: 12481 ft: 14863 corp: 16/530b lim: 45 exec/s: 25 rss: 74Mb L: 42/42 MS: 1 ChangeByte- 00:08:49.401 [2024-11-20 15:10:27.841804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c0c00ac0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.401 [2024-11-20 15:10:27.841830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.401 [2024-11-20 15:10:27.841919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.401 [2024-11-20 15:10:27.841935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.401 [2024-11-20 15:10:27.842032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.401 [2024-11-20 15:10:27.842047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:49.401 [2024-11-20 15:10:27.842134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.401 [2024-11-20 15:10:27.842149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:49.401 #26 NEW cov: 12481 ft: 14927 corp: 17/572b lim: 45 exec/s: 26 rss: 74Mb L: 42/42 MS: 1 ShuffleBytes- 00:08:49.401 [2024-11-20 15:10:27.891532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:eaeaeaea cdw11:eaea0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.401 [2024-11-20 15:10:27.891557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.401 [2024-11-20 15:10:27.891650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ea01eaea cdw11:01010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.401 [2024-11-20 15:10:27.891664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.401 #27 NEW cov: 12481 ft: 15189 corp: 18/592b lim: 45 exec/s: 27 rss: 74Mb L: 20/42 MS: 1 EraseBytes- 00:08:49.401 [2024-11-20 15:10:27.962881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:eaeaeaea cdw11:eaea0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.401 [2024-11-20 15:10:27.962906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.401 [2024-11-20 15:10:27.962995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:eaeaeaea cdw11:eaea0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.401 [2024-11-20 15:10:27.963014] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.401 [2024-11-20 15:10:27.963100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:eaeaeaea cdw11:eaea0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.401 [2024-11-20 15:10:27.963114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:49.401 [2024-11-20 15:10:27.963200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:0101ea01 cdw11:01010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.401 [2024-11-20 15:10:27.963216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:49.401 #28 NEW cov: 12481 ft: 15208 corp: 19/631b lim: 45 exec/s: 28 rss: 74Mb L: 39/42 MS: 1 InsertByte- 00:08:49.401 [2024-11-20 15:10:28.013141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c0c0b8c0 cdw11:c0c00000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.401 [2024-11-20 15:10:28.013165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.401 [2024-11-20 15:10:28.013249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.401 [2024-11-20 15:10:28.013263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.401 [2024-11-20 15:10:28.013367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.401 [2024-11-20 15:10:28.013382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:49.401 [2024-11-20 15:10:28.013469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.401 [2024-11-20 15:10:28.013484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:49.401 #29 NEW cov: 12481 ft: 15252 corp: 20/673b lim: 45 exec/s: 29 rss: 74Mb L: 42/42 MS: 1 ChangeBinInt- 00:08:49.401 [2024-11-20 15:10:28.063582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c0c040c0 cdw11:c0c00000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.401 [2024-11-20 15:10:28.063607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.401 [2024-11-20 15:10:28.063700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.401 [2024-11-20 15:10:28.063714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.401 [2024-11-20 15:10:28.063820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:c8c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.401 [2024-11-20 15:10:28.063836] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:49.401 [2024-11-20 15:10:28.063935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.401 [2024-11-20 15:10:28.063949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:49.659 #30 NEW cov: 12481 ft: 15260 corp: 21/715b lim: 45 exec/s: 30 rss: 74Mb L: 42/42 MS: 1 ChangeByte- 00:08:49.659 [2024-11-20 15:10:28.133163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:01010101 cdw11:01010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.659 [2024-11-20 15:10:28.133193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.659 [2024-11-20 15:10:28.133288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:01010101 cdw11:01010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.659 [2024-11-20 15:10:28.133304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.659 #31 NEW cov: 12481 ft: 15276 corp: 22/733b lim: 45 exec/s: 31 rss: 74Mb L: 18/42 MS: 1 CrossOver- 00:08:49.659 [2024-11-20 15:10:28.183632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:01010101 cdw11:01010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.659 [2024-11-20 15:10:28.183657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.659 [2024-11-20 15:10:28.183750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:01060101 cdw11:01010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.659 [2024-11-20 15:10:28.183764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.659 #32 NEW cov: 12481 ft: 15292 corp: 23/752b lim: 45 exec/s: 32 rss: 74Mb L: 19/42 MS: 1 InsertByte- 00:08:49.659 [2024-11-20 15:10:28.254221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c0c040c0 cdw11:c0c00000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.659 [2024-11-20 15:10:28.254246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.659 [2024-11-20 15:10:28.254337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.659 [2024-11-20 15:10:28.254352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.659 #33 NEW cov: 12481 ft: 15303 corp: 24/777b lim: 45 exec/s: 33 rss: 74Mb L: 25/42 MS: 1 EraseBytes- 00:08:49.659 [2024-11-20 15:10:28.324626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c0c0b8c0 cdw11:c0c00000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.659 [2024-11-20 15:10:28.324651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.659 [2024-11-20 
15:10:28.324739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.659 [2024-11-20 15:10:28.324755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.917 #34 NEW cov: 12481 ft: 15315 corp: 25/798b lim: 45 exec/s: 34 rss: 74Mb L: 21/42 MS: 1 CrossOver- 00:08:49.917 [2024-11-20 15:10:28.394758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:01010101 cdw11:01010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.917 [2024-11-20 15:10:28.394786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.917 #35 NEW cov: 12481 ft: 15346 corp: 26/809b lim: 45 exec/s: 35 rss: 74Mb L: 11/42 MS: 1 ShuffleBytes- 00:08:49.917 [2024-11-20 15:10:28.476634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:c0c0fb40 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.917 [2024-11-20 15:10:28.476661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.917 [2024-11-20 15:10:28.476748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.917 [2024-11-20 15:10:28.476767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.917 [2024-11-20 15:10:28.476860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:c0c0c0c0 cdw11:fb400006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.917 [2024-11-20 15:10:28.476874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:49.917 [2024-11-20 15:10:28.476969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.917 [2024-11-20 15:10:28.476985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:49.917 [2024-11-20 15:10:28.477074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.917 [2024-11-20 15:10:28.477089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:49.918 #36 NEW cov: 12481 ft: 15415 corp: 27/854b lim: 45 exec/s: 36 rss: 75Mb L: 45/45 MS: 1 CopyPart- 00:08:49.918 [2024-11-20 15:10:28.555312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:01010101 cdw11:01010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.918 [2024-11-20 15:10:28.555348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.918 #37 NEW cov: 12481 ft: 15471 corp: 28/865b lim: 45 exec/s: 18 rss: 75Mb L: 11/45 MS: 1 CrossOver- 00:08:49.918 #37 DONE cov: 12481 ft: 15471 corp: 28/865b lim: 45 exec/s: 18 rss: 75Mb 00:08:49.918 Done 37 runs in 2 second(s) 00:08:50.175 15:10:28 llvm_fuzz.nvmf_llvm_fuzz 
-- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_5.conf /var/tmp/suppress_nvmf_fuzz 00:08:50.175 15:10:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:50.175 15:10:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:50.175 15:10:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:50.175 15:10:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:08:50.175 15:10:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:50.175 15:10:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:50.175 15:10:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:50.175 15:10:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:08:50.175 15:10:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:50.175 15:10:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:50.175 15:10:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 6 00:08:50.175 15:10:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4406 00:08:50.175 15:10:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:50.175 15:10:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:08:50.175 15:10:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:50.175 15:10:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:50.175 15:10:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:50.175 15:10:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 00:08:50.175 [2024-11-20 15:10:28.744832] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:08:50.175 [2024-11-20 15:10:28.744902] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1475498 ] 00:08:50.433 [2024-11-20 15:10:28.961767] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:50.433 [2024-11-20 15:10:28.976298] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:50.433 [2024-11-20 15:10:29.029033] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:50.433 [2024-11-20 15:10:29.045272] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:08:50.433 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:50.433 INFO: Seed: 175024721 00:08:50.433 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:08:50.433 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:08:50.433 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:50.433 INFO: A corpus is not provided, starting from an empty corpus 00:08:50.433 #2 INITED exec/s: 0 rss: 66Mb 00:08:50.433 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:50.433 This may also happen if the target rejected all inputs we tried so far 00:08:50.433 [2024-11-20 15:10:29.090670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000250a cdw11:00000000 00:08:50.433 [2024-11-20 15:10:29.090699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.946 NEW_FUNC[1/714]: 0x464188 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:08:50.946 NEW_FUNC[2/714]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:50.946 #3 NEW cov: 12176 ft: 12172 corp: 2/3b lim: 10 exec/s: 0 rss: 73Mb L: 2/2 MS: 1 InsertByte- 00:08:50.946 [2024-11-20 15:10:29.431629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007eec cdw11:00000000 00:08:50.946 [2024-11-20 15:10:29.431665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.946 [2024-11-20 15:10:29.431717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ecec cdw11:00000000 00:08:50.946 [2024-11-20 15:10:29.431732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:50.946 #6 NEW cov: 12290 ft: 13073 corp: 3/8b lim: 10 exec/s: 0 rss: 73Mb L: 5/5 MS: 3 ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:50.946 [2024-11-20 15:10:29.471659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008113 cdw11:00000000 00:08:50.946 [2024-11-20 15:10:29.471686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.946 [2024-11-20 15:10:29.471738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00001313 cdw11:00000000 00:08:50.946 [2024-11-20 15:10:29.471752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:50.946 #7 NEW cov: 12296 ft: 13260 corp: 4/13b lim: 10 exec/s: 0 rss: 73Mb L: 5/5 MS: 1 ChangeBinInt- 00:08:50.946 [2024-11-20 15:10:29.531984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:50.946 [2024-11-20 15:10:29.532011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.946 [2024-11-20 15:10:29.532064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:50.946 [2024-11-20 15:10:29.532081] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:50.946 [2024-11-20 15:10:29.532132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000040 cdw11:00000000 00:08:50.946 [2024-11-20 15:10:29.532147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:50.946 #10 NEW cov: 12381 ft: 13704 corp: 5/19b lim: 10 exec/s: 0 rss: 74Mb L: 6/6 MS: 3 ChangeByte-ChangeByte-InsertRepeatedBytes- 00:08:50.946 [2024-11-20 15:10:29.572160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000abc cdw11:00000000 00:08:50.946 [2024-11-20 15:10:29.572186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.946 [2024-11-20 15:10:29.572237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000bcbc cdw11:00000000 00:08:50.946 [2024-11-20 15:10:29.572251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:50.946 [2024-11-20 15:10:29.572302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000bcbc cdw11:00000000 00:08:50.946 [2024-11-20 15:10:29.572320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:50.946 [2024-11-20 15:10:29.572371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000bcbc cdw11:00000000 00:08:50.946 [2024-11-20 15:10:29.572385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:50.946 #11 NEW cov: 12381 ft: 14057 corp: 6/27b lim: 10 exec/s: 0 rss: 74Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:08:50.946 [2024-11-20 15:10:29.611906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ae8 cdw11:00000000 00:08:50.946 [2024-11-20 15:10:29.611933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.204 #15 NEW cov: 12381 ft: 14114 corp: 7/29b lim: 10 exec/s: 0 rss: 74Mb L: 2/8 MS: 4 ChangeBinInt-ChangeBinInt-ChangeBit-CrossOver- 00:08:51.204 [2024-11-20 15:10:29.652492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.204 [2024-11-20 15:10:29.652520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.204 [2024-11-20 15:10:29.652573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.205 [2024-11-20 15:10:29.652588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.205 [2024-11-20 15:10:29.652643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.205 [2024-11-20 15:10:29.652658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:51.205 [2024-11-20 15:10:29.652711] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.205 [2024-11-20 15:10:29.652724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:51.205 [2024-11-20 15:10:29.652777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000250a cdw11:00000000 00:08:51.205 [2024-11-20 15:10:29.652791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:51.205 #16 NEW cov: 12381 ft: 14249 corp: 8/39b lim: 10 exec/s: 0 rss: 74Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:08:51.205 [2024-11-20 15:10:29.712328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007ee8 cdw11:00000000 00:08:51.205 [2024-11-20 15:10:29.712359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.205 [2024-11-20 15:10:29.712412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ecec cdw11:00000000 00:08:51.205 [2024-11-20 15:10:29.712426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.205 #17 NEW cov: 12381 ft: 14331 corp: 9/44b lim: 10 exec/s: 0 rss: 74Mb L: 5/10 MS: 1 ChangeBit- 00:08:51.205 [2024-11-20 15:10:29.752705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000250a cdw11:00000000 00:08:51.205 [2024-11-20 15:10:29.752733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.205 [2024-11-20 15:10:29.752789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:51.205 [2024-11-20 15:10:29.752803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.205 [2024-11-20 15:10:29.752856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:51.205 [2024-11-20 15:10:29.752872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:51.205 [2024-11-20 15:10:29.752924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:51.205 [2024-11-20 15:10:29.752939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:51.205 #18 NEW cov: 12381 ft: 14394 corp: 10/52b lim: 10 exec/s: 0 rss: 74Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:08:51.205 [2024-11-20 15:10:29.792699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.205 [2024-11-20 15:10:29.792727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.205 [2024-11-20 15:10:29.792781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.205 [2024-11-20 15:10:29.792795] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.205 [2024-11-20 15:10:29.792847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.205 [2024-11-20 15:10:29.792861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:51.205 #19 NEW cov: 12381 ft: 14488 corp: 11/59b lim: 10 exec/s: 0 rss: 74Mb L: 7/10 MS: 1 InsertByte- 00:08:51.205 [2024-11-20 15:10:29.852605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a23 cdw11:00000000 00:08:51.205 [2024-11-20 15:10:29.852632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.205 #21 NEW cov: 12381 ft: 14515 corp: 12/61b lim: 10 exec/s: 0 rss: 74Mb L: 2/10 MS: 2 CopyPart-InsertByte- 00:08:51.463 [2024-11-20 15:10:29.892836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.463 [2024-11-20 15:10:29.892863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.463 [2024-11-20 15:10:29.892918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.463 [2024-11-20 15:10:29.892934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.463 #22 NEW cov: 12381 ft: 14532 corp: 13/65b lim: 10 exec/s: 0 rss: 74Mb L: 4/10 MS: 1 CrossOver- 00:08:51.463 [2024-11-20 15:10:29.953063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000250a cdw11:00000000 00:08:51.463 [2024-11-20 15:10:29.953090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.463 [2024-11-20 15:10:29.953142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:51.463 [2024-11-20 15:10:29.953156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.463 [2024-11-20 15:10:29.953209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:51.463 [2024-11-20 15:10:29.953223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:51.463 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:51.463 #23 NEW cov: 12404 ft: 14564 corp: 14/71b lim: 10 exec/s: 0 rss: 74Mb L: 6/10 MS: 1 EraseBytes- 00:08:51.463 [2024-11-20 15:10:30.026026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.464 [2024-11-20 15:10:30.026073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.464 [2024-11-20 15:10:30.026169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.464 [2024-11-20 15:10:30.026189] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.464 [2024-11-20 15:10:30.026285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.464 [2024-11-20 15:10:30.026306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:51.464 [2024-11-20 15:10:30.026405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 00:08:51.464 [2024-11-20 15:10:30.026425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:51.464 [2024-11-20 15:10:30.026518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000e80a cdw11:00000000 00:08:51.464 [2024-11-20 15:10:30.026537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:51.464 #24 NEW cov: 12404 ft: 14637 corp: 15/81b lim: 10 exec/s: 0 rss: 74Mb L: 10/10 MS: 1 CrossOver- 00:08:51.464 [2024-11-20 15:10:30.085967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000250a cdw11:00000000 00:08:51.464 [2024-11-20 15:10:30.085999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.464 [2024-11-20 15:10:30.086081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:51.464 [2024-11-20 15:10:30.086097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.464 [2024-11-20 15:10:30.086191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:51.464 [2024-11-20 15:10:30.086207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:51.464 [2024-11-20 15:10:30.086301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000fdff cdw11:00000000 00:08:51.464 [2024-11-20 15:10:30.086319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:51.464 #25 NEW cov: 12404 ft: 14729 corp: 16/89b lim: 10 exec/s: 25 rss: 74Mb L: 8/10 MS: 1 ChangeBit- 00:08:51.464 [2024-11-20 15:10:30.135878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 00:08:51.464 [2024-11-20 15:10:30.135906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.464 [2024-11-20 15:10:30.135998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:51.464 [2024-11-20 15:10:30.136013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.464 [2024-11-20 15:10:30.136103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:51.464 [2024-11-20 15:10:30.136118] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:51.464 [2024-11-20 15:10:30.136217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.464 [2024-11-20 15:10:30.136232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:51.721 #26 NEW cov: 12404 ft: 14740 corp: 17/98b lim: 10 exec/s: 26 rss: 74Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:08:51.721 [2024-11-20 15:10:30.205696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000fcfc cdw11:00000000 00:08:51.721 [2024-11-20 15:10:30.205722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.721 #31 NEW cov: 12404 ft: 14752 corp: 18/100b lim: 10 exec/s: 31 rss: 74Mb L: 2/10 MS: 5 ShuffleBytes-ShuffleBytes-ChangeBit-ChangeByte-CopyPart- 00:08:51.721 [2024-11-20 15:10:30.256483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.721 [2024-11-20 15:10:30.256521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.721 [2024-11-20 15:10:30.256631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008000 cdw11:00000000 00:08:51.721 [2024-11-20 15:10:30.256646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.721 [2024-11-20 15:10:30.256731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000040 cdw11:00000000 00:08:51.722 [2024-11-20 15:10:30.256745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:51.722 #32 NEW cov: 12404 ft: 14772 corp: 19/106b lim: 10 exec/s: 32 rss: 74Mb L: 6/10 MS: 1 ChangeBit- 00:08:51.722 [2024-11-20 15:10:30.306617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.722 [2024-11-20 15:10:30.306643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.722 [2024-11-20 15:10:30.306726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000040 cdw11:00000000 00:08:51.722 [2024-11-20 15:10:30.306740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.722 [2024-11-20 15:10:30.306845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.722 [2024-11-20 15:10:30.306860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:51.722 #33 NEW cov: 12404 ft: 14833 corp: 20/113b lim: 10 exec/s: 33 rss: 74Mb L: 7/10 MS: 1 ShuffleBytes- 00:08:51.722 [2024-11-20 15:10:30.377682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 00:08:51.722 [2024-11-20 15:10:30.377708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.722 [2024-11-20 15:10:30.377804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:51.722 [2024-11-20 15:10:30.377819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.722 [2024-11-20 15:10:30.377908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:51.722 [2024-11-20 15:10:30.377922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:51.722 [2024-11-20 15:10:30.378005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:000000ec cdw11:00000000 00:08:51.722 [2024-11-20 15:10:30.378020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:51.722 [2024-11-20 15:10:30.378097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.722 [2024-11-20 15:10:30.378112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:51.980 #34 NEW cov: 12404 ft: 14880 corp: 21/123b lim: 10 exec/s: 34 rss: 74Mb L: 10/10 MS: 1 CrossOver- 00:08:51.980 [2024-11-20 15:10:30.447915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000250a cdw11:00000000 00:08:51.980 [2024-11-20 15:10:30.447942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.980 [2024-11-20 15:10:30.448025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:51.980 [2024-11-20 15:10:30.448041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.980 [2024-11-20 15:10:30.448125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:51.980 [2024-11-20 15:10:30.448140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:51.980 [2024-11-20 15:10:30.448224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:51.980 [2024-11-20 15:10:30.448239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:51.980 [2024-11-20 15:10:30.448323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:51.980 [2024-11-20 15:10:30.448354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:51.980 #35 NEW cov: 12404 ft: 14924 corp: 22/133b lim: 10 exec/s: 35 rss: 74Mb L: 10/10 MS: 1 CrossOver- 00:08:51.980 [2024-11-20 15:10:30.497287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.980 [2024-11-20 15:10:30.497318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.980 [2024-11-20 15:10:30.497401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.980 [2024-11-20 15:10:30.497417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.980 #36 NEW cov: 12404 ft: 14992 corp: 23/137b lim: 10 exec/s: 36 rss: 74Mb L: 4/10 MS: 1 CopyPart- 00:08:51.980 [2024-11-20 15:10:30.547422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.980 [2024-11-20 15:10:30.547447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.980 [2024-11-20 15:10:30.547533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:51.980 [2024-11-20 15:10:30.547547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.980 #37 NEW cov: 12404 ft: 15010 corp: 24/141b lim: 10 exec/s: 37 rss: 74Mb L: 4/10 MS: 1 CopyPart- 00:08:51.980 [2024-11-20 15:10:30.617732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008113 cdw11:00000000 00:08:51.980 [2024-11-20 15:10:30.617757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.980 [2024-11-20 15:10:30.617842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000013ec cdw11:00000000 00:08:51.980 [2024-11-20 15:10:30.617856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.980 #38 NEW cov: 12404 ft: 15028 corp: 25/145b lim: 10 exec/s: 38 rss: 74Mb L: 4/10 MS: 1 EraseBytes- 00:08:52.238 [2024-11-20 15:10:30.688136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000500 cdw11:00000000 00:08:52.238 [2024-11-20 15:10:30.688164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:52.238 [2024-11-20 15:10:30.688268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00001313 cdw11:00000000 00:08:52.238 [2024-11-20 15:10:30.688284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:52.238 #39 NEW cov: 12404 ft: 15045 corp: 26/150b lim: 10 exec/s: 39 rss: 74Mb L: 5/10 MS: 1 ChangeBinInt- 00:08:52.238 [2024-11-20 15:10:30.739360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 00:08:52.238 [2024-11-20 15:10:30.739398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:52.238 [2024-11-20 15:10:30.739499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:52.238 [2024-11-20 15:10:30.739515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:52.238 [2024-11-20 15:10:30.739606] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:52.238 [2024-11-20 15:10:30.739621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:52.239 [2024-11-20 15:10:30.739708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:000000ec cdw11:00000000 00:08:52.239 [2024-11-20 15:10:30.739723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:52.239 [2024-11-20 15:10:30.739805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:000000ff cdw11:00000000 00:08:52.239 [2024-11-20 15:10:30.739821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:52.239 #40 NEW cov: 12404 ft: 15054 corp: 27/160b lim: 10 exec/s: 40 rss: 74Mb L: 10/10 MS: 1 CopyPart- 00:08:52.239 [2024-11-20 15:10:30.810003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000500 cdw11:00000000 00:08:52.239 [2024-11-20 15:10:30.810029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:52.239 [2024-11-20 15:10:30.810116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000500 cdw11:00000000 00:08:52.239 [2024-11-20 15:10:30.810131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:52.239 [2024-11-20 15:10:30.810219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00001313 cdw11:00000000 00:08:52.239 [2024-11-20 15:10:30.810234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:52.239 [2024-11-20 15:10:30.810325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ec13 cdw11:00000000 00:08:52.239 [2024-11-20 15:10:30.810357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:52.239 [2024-11-20 15:10:30.810438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:000013ec cdw11:00000000 00:08:52.239 [2024-11-20 15:10:30.810452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:52.239 #41 NEW cov: 12404 ft: 15067 corp: 28/170b lim: 10 exec/s: 41 rss: 74Mb L: 10/10 MS: 1 CopyPart- 00:08:52.239 [2024-11-20 15:10:30.879393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000013ec cdw11:00000000 00:08:52.239 [2024-11-20 15:10:30.879419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:52.239 #42 NEW cov: 12404 ft: 15081 corp: 29/172b lim: 10 exec/s: 42 rss: 74Mb L: 2/10 MS: 1 EraseBytes- 00:08:52.498 [2024-11-20 15:10:30.950126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000250a cdw11:00000000 00:08:52.498 [2024-11-20 15:10:30.950152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:52.498 [2024-11-20 15:10:30.950230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:52.498 [2024-11-20 15:10:30.950245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:52.498 [2024-11-20 15:10:30.950332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000fdff cdw11:00000000 00:08:52.498 [2024-11-20 15:10:30.950347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:52.498 #43 NEW cov: 12404 ft: 15097 corp: 30/178b lim: 10 exec/s: 43 rss: 74Mb L: 6/10 MS: 1 EraseBytes- 00:08:52.498 [2024-11-20 15:10:31.020985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:52.498 [2024-11-20 15:10:31.021014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:52.498 [2024-11-20 15:10:31.021110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:52.498 [2024-11-20 15:10:31.021126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:52.498 [2024-11-20 15:10:31.021215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:52.498 [2024-11-20 15:10:31.021231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:52.498 [2024-11-20 15:10:31.021329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 00:08:52.498 [2024-11-20 15:10:31.021344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:52.498 [2024-11-20 15:10:31.021431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000e802 cdw11:00000000 00:08:52.498 [2024-11-20 15:10:31.021447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:52.498 #44 NEW cov: 12404 ft: 15117 corp: 31/188b lim: 10 exec/s: 44 rss: 74Mb L: 10/10 MS: 1 ChangeBit- 00:08:52.498 [2024-11-20 15:10:31.089998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aef cdw11:00000000 00:08:52.498 [2024-11-20 15:10:31.090027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:52.498 #45 NEW cov: 12404 ft: 15118 corp: 32/190b lim: 10 exec/s: 22 rss: 74Mb L: 2/10 MS: 1 ChangeBinInt- 00:08:52.498 #45 DONE cov: 12404 ft: 15118 corp: 32/190b lim: 10 exec/s: 22 rss: 74Mb 00:08:52.498 Done 45 runs in 2 second(s) 00:08:52.758 15:10:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_6.conf /var/tmp/suppress_nvmf_fuzz 00:08:52.758 15:10:31 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:52.758 15:10:31 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:52.758 15:10:31 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # 
start_llvm_fuzz 7 1 0x1 00:08:52.758 15:10:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:08:52.758 15:10:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:52.758 15:10:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:52.758 15:10:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:52.758 15:10:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:08:52.758 15:10:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:52.758 15:10:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:52.758 15:10:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 7 00:08:52.758 15:10:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4407 00:08:52.758 15:10:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:52.758 15:10:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:08:52.758 15:10:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:52.758 15:10:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:52.758 15:10:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:52.758 15:10:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 00:08:52.758 [2024-11-20 15:10:31.275615] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:08:52.758 [2024-11-20 15:10:31.275686] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1475809 ] 00:08:53.017 [2024-11-20 15:10:31.490708] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:53.017 [2024-11-20 15:10:31.505359] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:53.017 [2024-11-20 15:10:31.558074] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:53.017 [2024-11-20 15:10:31.574306] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:08:53.017 INFO: Running with entropic power schedule (0xFF, 100). 
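The -F argument assembled above packs the entire connection target into one space-separated key:value string: transport type, address family, subsystem NQN, address, and service ID (here the TCP port 4407 derived from the fuzzer number). As an illustrative aside, not code from the harness: SPDK's public spdk_nvme_transport_id_parse() from include/spdk/nvme.h accepts exactly this string format. A minimal standalone sketch, assuming the SPDK headers and NVMe library are available to build against:

    #include <stdio.h>
    #include "spdk/nvme.h"

    /* Illustrative sketch only: parse the same kind of trid string that
     * run.sh passes via -F. spdk_nvme_transport_id_parse() splits the
     * space-separated key:value pairs into the struct fields used to
     * reach the target. */
    int main(void)
    {
        struct spdk_nvme_transport_id trid = {0};
        const char *str = "trtype:tcp adrfam:IPv4 "
                          "subnqn:nqn.2016-06.io.spdk:cnode1 "
                          "traddr:127.0.0.1 trsvcid:4407";

        if (spdk_nvme_transport_id_parse(&trid, str) != 0) {
            fprintf(stderr, "failed to parse transport ID\n");
            return 1;
        }
        printf("traddr=%s trsvcid=%s subnqn=%s\n",
               trid.traddr, trid.trsvcid, trid.subnqn);
        return 0;
    }

Note how the sed line above only rewrites "trsvcid" in the shared fuzz_json.conf template, so each fuzzer instance gets its own port while every other connection parameter stays identical.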
00:08:53.017 INFO: Seed: 2703022553 00:08:53.017 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:08:53.017 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:08:53.017 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:53.017 INFO: A corpus is not provided, starting from an empty corpus 00:08:53.017 #2 INITED exec/s: 0 rss: 66Mb 00:08:53.017 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:53.017 This may also happen if the target rejected all inputs we tried so far 00:08:53.017 [2024-11-20 15:10:31.622188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004aff cdw11:00000000 00:08:53.017 [2024-11-20 15:10:31.622218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.017 [2024-11-20 15:10:31.622270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:53.017 [2024-11-20 15:10:31.622283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.017 [2024-11-20 15:10:31.622324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:53.017 [2024-11-20 15:10:31.622335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:53.275 NEW_FUNC[1/714]: 0x464b88 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:08:53.275 NEW_FUNC[2/714]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:53.275 #7 NEW cov: 12185 ft: 12185 corp: 2/7b lim: 10 exec/s: 0 rss: 73Mb L: 6/6 MS: 5 CopyPart-ChangeBit-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:53.534 [2024-11-20 15:10:31.962797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000876 cdw11:00000000 00:08:53.534 [2024-11-20 15:10:31.962834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.534 #9 NEW cov: 12299 ft: 13013 corp: 3/9b lim: 10 exec/s: 0 rss: 73Mb L: 2/6 MS: 2 ChangeBit-InsertByte- 00:08:53.534 [2024-11-20 15:10:32.002957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004aff cdw11:00000000 00:08:53.534 [2024-11-20 15:10:32.002985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.534 [2024-11-20 15:10:32.003037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:53.534 [2024-11-20 15:10:32.003051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.534 [2024-11-20 15:10:32.003100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff7f cdw11:00000000 00:08:53.534 [2024-11-20 15:10:32.003114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:53.534 #10 NEW cov: 12305 ft: 13192 corp: 4/15b lim: 10 exec/s: 0 rss: 73Mb L: 6/6 MS: 1 ChangeBit- 00:08:53.534 [2024-11-20 15:10:32.063154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a7f cdw11:00000000 00:08:53.534 [2024-11-20 15:10:32.063181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.534 [2024-11-20 15:10:32.063233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:53.534 [2024-11-20 15:10:32.063247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.534 [2024-11-20 15:10:32.063298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff7f cdw11:00000000 00:08:53.534 [2024-11-20 15:10:32.063313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:53.534 #11 NEW cov: 12390 ft: 13428 corp: 5/21b lim: 10 exec/s: 0 rss: 74Mb L: 6/6 MS: 1 ChangeBit- 00:08:53.534 [2024-11-20 15:10:32.123305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004aff cdw11:00000000 00:08:53.534 [2024-11-20 15:10:32.123342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.534 [2024-11-20 15:10:32.123391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:53.534 [2024-11-20 15:10:32.123405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.534 [2024-11-20 15:10:32.123457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ad7f cdw11:00000000 00:08:53.534 [2024-11-20 15:10:32.123471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:53.534 #12 NEW cov: 12390 ft: 13589 corp: 6/27b lim: 10 exec/s: 0 rss: 74Mb L: 6/6 MS: 1 ChangeByte- 00:08:53.534 [2024-11-20 15:10:32.163396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a7f cdw11:00000000 00:08:53.534 [2024-11-20 15:10:32.163423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.534 [2024-11-20 15:10:32.163474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:53.534 [2024-11-20 15:10:32.163488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.534 [2024-11-20 15:10:32.163538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff24 cdw11:00000000 00:08:53.534 [2024-11-20 15:10:32.163552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:53.534 #13 NEW cov: 12390 ft: 13664 corp: 7/33b lim: 10 exec/s: 0 rss: 74Mb L: 6/6 MS: 1 ChangeByte- 00:08:53.792 [2024-11-20 15:10:32.223640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a7f cdw11:00000000 00:08:53.792 [2024-11-20 15:10:32.223666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.792 [2024-11-20 15:10:32.223716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff7f cdw11:00000000 00:08:53.792 [2024-11-20 15:10:32.223731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.792 [2024-11-20 15:10:32.223780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff24 cdw11:00000000 00:08:53.792 [2024-11-20 15:10:32.223794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:53.792 #14 NEW cov: 12390 ft: 13704 corp: 8/39b lim: 10 exec/s: 0 rss: 74Mb L: 6/6 MS: 1 ChangeBit- 00:08:53.792 [2024-11-20 15:10:32.283776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a6a cdw11:00000000 00:08:53.792 [2024-11-20 15:10:32.283804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.792 [2024-11-20 15:10:32.283854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006a6a cdw11:00000000 00:08:53.792 [2024-11-20 15:10:32.283868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.792 [2024-11-20 15:10:32.283919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006a6a cdw11:00000000 00:08:53.792 [2024-11-20 15:10:32.283933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:53.792 #16 NEW cov: 12390 ft: 13757 corp: 9/46b lim: 10 exec/s: 0 rss: 74Mb L: 7/7 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:53.792 [2024-11-20 15:10:32.323646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005b08 cdw11:00000000 00:08:53.792 [2024-11-20 15:10:32.323676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.792 #21 NEW cov: 12390 ft: 13894 corp: 10/49b lim: 10 exec/s: 0 rss: 74Mb L: 3/7 MS: 5 CrossOver-ChangeBit-ChangeBit-ChangeBinInt-CrossOver- 00:08:53.792 [2024-11-20 15:10:32.363990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004aff cdw11:00000000 00:08:53.792 [2024-11-20 15:10:32.364017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.792 [2024-11-20 15:10:32.364068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffbf cdw11:00000000 00:08:53.792 [2024-11-20 15:10:32.364082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.792 [2024-11-20 15:10:32.364132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ad7f cdw11:00000000 00:08:53.793 [2024-11-20 15:10:32.364146] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:53.793 #22 NEW cov: 12390 ft: 14027 corp: 11/55b lim: 10 exec/s: 0 rss: 74Mb L: 6/7 MS: 1 ChangeBit- 00:08:53.793 [2024-11-20 15:10:32.423914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005b30 cdw11:00000000 00:08:53.793 [2024-11-20 15:10:32.423941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.793 #25 NEW cov: 12390 ft: 14043 corp: 12/57b lim: 10 exec/s: 0 rss: 74Mb L: 2/7 MS: 3 CrossOver-ChangeByte-InsertByte- 00:08:53.793 [2024-11-20 15:10:32.464266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a6a cdw11:00000000 00:08:53.793 [2024-11-20 15:10:32.464292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.793 [2024-11-20 15:10:32.464344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff7f cdw11:00000000 00:08:53.793 [2024-11-20 15:10:32.464359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.793 [2024-11-20 15:10:32.464410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff24 cdw11:00000000 00:08:53.793 [2024-11-20 15:10:32.464423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.051 #26 NEW cov: 12390 ft: 14094 corp: 13/63b lim: 10 exec/s: 0 rss: 74Mb L: 6/7 MS: 1 ChangeByte- 00:08:54.051 [2024-11-20 15:10:32.524446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a6a cdw11:00000000 00:08:54.051 [2024-11-20 15:10:32.524474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.051 [2024-11-20 15:10:32.524524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006a6a cdw11:00000000 00:08:54.051 [2024-11-20 15:10:32.524539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.051 [2024-11-20 15:10:32.524588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006a2a cdw11:00000000 00:08:54.051 [2024-11-20 15:10:32.524602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.051 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:54.051 #27 NEW cov: 12413 ft: 14146 corp: 14/70b lim: 10 exec/s: 0 rss: 74Mb L: 7/7 MS: 1 ChangeBit- 00:08:54.051 [2024-11-20 15:10:32.584607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a7f cdw11:00000000 00:08:54.051 [2024-11-20 15:10:32.584638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.051 [2024-11-20 15:10:32.584689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff7f cdw11:00000000 00:08:54.051 [2024-11-20 15:10:32.584703] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.051 [2024-11-20 15:10:32.584753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff24 cdw11:00000000 00:08:54.051 [2024-11-20 15:10:32.584767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.051 #28 NEW cov: 12413 ft: 14179 corp: 15/76b lim: 10 exec/s: 0 rss: 74Mb L: 6/7 MS: 1 ChangeByte- 00:08:54.051 [2024-11-20 15:10:32.624701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000824a cdw11:00000000 00:08:54.051 [2024-11-20 15:10:32.624729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.051 [2024-11-20 15:10:32.624780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:54.051 [2024-11-20 15:10:32.624794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.051 [2024-11-20 15:10:32.624842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:54.051 [2024-11-20 15:10:32.624857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.051 #29 NEW cov: 12413 ft: 14183 corp: 16/83b lim: 10 exec/s: 29 rss: 74Mb L: 7/7 MS: 1 InsertByte- 00:08:54.051 [2024-11-20 15:10:32.664680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004aff cdw11:00000000 00:08:54.051 [2024-11-20 15:10:32.664707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.051 [2024-11-20 15:10:32.664759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000bfad cdw11:00000000 00:08:54.051 [2024-11-20 15:10:32.664772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.051 #30 NEW cov: 12413 ft: 14349 corp: 17/88b lim: 10 exec/s: 30 rss: 74Mb L: 5/7 MS: 1 EraseBytes- 00:08:54.051 [2024-11-20 15:10:32.724837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004aff cdw11:00000000 00:08:54.051 [2024-11-20 15:10:32.724864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.051 [2024-11-20 15:10:32.724917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:54.051 [2024-11-20 15:10:32.724931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.310 #31 NEW cov: 12413 ft: 14364 corp: 18/93b lim: 10 exec/s: 31 rss: 74Mb L: 5/7 MS: 1 EraseBytes- 00:08:54.310 [2024-11-20 15:10:32.765098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a6a cdw11:00000000 00:08:54.310 [2024-11-20 15:10:32.765124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.310 
[2024-11-20 15:10:32.765176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006a6a cdw11:00000000 00:08:54.310 [2024-11-20 15:10:32.765190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.310 [2024-11-20 15:10:32.765240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000c26a cdw11:00000000 00:08:54.310 [2024-11-20 15:10:32.765257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.310 #32 NEW cov: 12413 ft: 14373 corp: 19/100b lim: 10 exec/s: 32 rss: 74Mb L: 7/7 MS: 1 ChangeByte- 00:08:54.310 [2024-11-20 15:10:32.805086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a7f cdw11:00000000 00:08:54.310 [2024-11-20 15:10:32.805111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.310 [2024-11-20 15:10:32.805165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:54.310 [2024-11-20 15:10:32.805180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.310 #33 NEW cov: 12413 ft: 14389 corp: 20/105b lim: 10 exec/s: 33 rss: 74Mb L: 5/7 MS: 1 EraseBytes- 00:08:54.310 [2024-11-20 15:10:32.845070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000305b cdw11:00000000 00:08:54.310 [2024-11-20 15:10:32.845096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.310 #34 NEW cov: 12413 ft: 14394 corp: 21/107b lim: 10 exec/s: 34 rss: 74Mb L: 2/7 MS: 1 ShuffleBytes- 00:08:54.310 [2024-11-20 15:10:32.905500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a6a cdw11:00000000 00:08:54.310 [2024-11-20 15:10:32.905527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.310 [2024-11-20 15:10:32.905579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006a6a cdw11:00000000 00:08:54.310 [2024-11-20 15:10:32.905593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.310 [2024-11-20 15:10:32.905643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002a6a cdw11:00000000 00:08:54.310 [2024-11-20 15:10:32.905657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.310 #35 NEW cov: 12413 ft: 14407 corp: 22/113b lim: 10 exec/s: 35 rss: 74Mb L: 6/7 MS: 1 CrossOver- 00:08:54.310 [2024-11-20 15:10:32.945590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a6a cdw11:00000000 00:08:54.310 [2024-11-20 15:10:32.945616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.310 [2024-11-20 15:10:32.945669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 
cid:5 nsid:0 cdw10:00006a6a cdw11:00000000 00:08:54.310 [2024-11-20 15:10:32.945684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.310 [2024-11-20 15:10:32.945735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004aff cdw11:00000000 00:08:54.310 [2024-11-20 15:10:32.945750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.310 #36 NEW cov: 12413 ft: 14426 corp: 23/120b lim: 10 exec/s: 36 rss: 74Mb L: 7/7 MS: 1 CrossOver- 00:08:54.310 [2024-11-20 15:10:32.985700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004aff cdw11:00000000 00:08:54.310 [2024-11-20 15:10:32.985726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.310 [2024-11-20 15:10:32.985777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffe8 cdw11:00000000 00:08:54.310 [2024-11-20 15:10:32.985791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.310 [2024-11-20 15:10:32.985843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffad cdw11:00000000 00:08:54.310 [2024-11-20 15:10:32.985858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.569 #37 NEW cov: 12413 ft: 14441 corp: 24/127b lim: 10 exec/s: 37 rss: 74Mb L: 7/7 MS: 1 InsertByte- 00:08:54.569 [2024-11-20 15:10:33.025666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004aff cdw11:00000000 00:08:54.569 [2024-11-20 15:10:33.025692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.569 [2024-11-20 15:10:33.025744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:54.569 [2024-11-20 15:10:33.025758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.569 #38 NEW cov: 12413 ft: 14456 corp: 25/132b lim: 10 exec/s: 38 rss: 74Mb L: 5/7 MS: 1 EraseBytes- 00:08:54.569 [2024-11-20 15:10:33.065925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a6a cdw11:00000000 00:08:54.569 [2024-11-20 15:10:33.065953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.569 [2024-11-20 15:10:33.066005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00009695 cdw11:00000000 00:08:54.569 [2024-11-20 15:10:33.066019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.569 [2024-11-20 15:10:33.066069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00003d95 cdw11:00000000 00:08:54.569 [2024-11-20 15:10:33.066084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.569 #39 NEW 
cov: 12413 ft: 14491 corp: 26/139b lim: 10 exec/s: 39 rss: 74Mb L: 7/7 MS: 1 ChangeBinInt- 00:08:54.569 [2024-11-20 15:10:33.126075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a2a cdw11:00000000 00:08:54.569 [2024-11-20 15:10:33.126102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.569 [2024-11-20 15:10:33.126155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006a6a cdw11:00000000 00:08:54.569 [2024-11-20 15:10:33.126170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.569 [2024-11-20 15:10:33.126220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006a6a cdw11:00000000 00:08:54.569 [2024-11-20 15:10:33.126235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.569 #40 NEW cov: 12413 ft: 14550 corp: 27/145b lim: 10 exec/s: 40 rss: 74Mb L: 6/7 MS: 1 ShuffleBytes- 00:08:54.569 [2024-11-20 15:10:33.186247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a7f cdw11:00000000 00:08:54.569 [2024-11-20 15:10:33.186273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.569 [2024-11-20 15:10:33.186329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:54.569 [2024-11-20 15:10:33.186344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.569 [2024-11-20 15:10:33.186397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000a37f cdw11:00000000 00:08:54.569 [2024-11-20 15:10:33.186411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.569 #41 NEW cov: 12413 ft: 14568 corp: 28/151b lim: 10 exec/s: 41 rss: 74Mb L: 6/7 MS: 1 ChangeByte- 00:08:54.569 [2024-11-20 15:10:33.226466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a23 cdw11:00000000 00:08:54.569 [2024-11-20 15:10:33.226492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.569 [2024-11-20 15:10:33.226546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006a6a cdw11:00000000 00:08:54.569 [2024-11-20 15:10:33.226561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.569 [2024-11-20 15:10:33.226613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006a6a cdw11:00000000 00:08:54.569 [2024-11-20 15:10:33.226628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.569 [2024-11-20 15:10:33.226680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006a6a cdw11:00000000 00:08:54.569 [2024-11-20 15:10:33.226693] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:54.569 #42 NEW cov: 12413 ft: 14796 corp: 29/159b lim: 10 exec/s: 42 rss: 74Mb L: 8/8 MS: 1 InsertByte- 00:08:54.828 [2024-11-20 15:10:33.266589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004aff cdw11:00000000 00:08:54.828 [2024-11-20 15:10:33.266615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.828 [2024-11-20 15:10:33.266667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:54.828 [2024-11-20 15:10:33.266682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.828 [2024-11-20 15:10:33.266733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:54.828 [2024-11-20 15:10:33.266748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.828 [2024-11-20 15:10:33.266799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:54.828 [2024-11-20 15:10:33.266813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:54.828 #43 NEW cov: 12413 ft: 14804 corp: 30/168b lim: 10 exec/s: 43 rss: 74Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:08:54.828 [2024-11-20 15:10:33.306830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a7f cdw11:00000000 00:08:54.828 [2024-11-20 15:10:33.306856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.828 [2024-11-20 15:10:33.306907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:54.828 [2024-11-20 15:10:33.306921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.828 [2024-11-20 15:10:33.306971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff00 cdw11:00000000 00:08:54.828 [2024-11-20 15:10:33.306985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.828 [2024-11-20 15:10:33.307034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:54.828 [2024-11-20 15:10:33.307048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:54.828 [2024-11-20 15:10:33.307098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000007f cdw11:00000000 00:08:54.828 [2024-11-20 15:10:33.307116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:54.828 #44 NEW cov: 12413 ft: 14842 corp: 31/178b lim: 10 exec/s: 44 rss: 74Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:08:54.828 [2024-11-20 15:10:33.346733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000cd4a cdw11:00000000 00:08:54.828 [2024-11-20 15:10:33.346759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.828 [2024-11-20 15:10:33.346813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007fff cdw11:00000000 00:08:54.828 [2024-11-20 15:10:33.346827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.828 [2024-11-20 15:10:33.346880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffa3 cdw11:00000000 00:08:54.828 [2024-11-20 15:10:33.346895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.828 #45 NEW cov: 12413 ft: 14854 corp: 32/185b lim: 10 exec/s: 45 rss: 75Mb L: 7/10 MS: 1 InsertByte- 00:08:54.828 [2024-11-20 15:10:33.406665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000876 cdw11:00000000 00:08:54.828 [2024-11-20 15:10:33.406691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.828 #46 NEW cov: 12413 ft: 14872 corp: 33/187b lim: 10 exec/s: 46 rss: 75Mb L: 2/10 MS: 1 ShuffleBytes- 00:08:54.828 [2024-11-20 15:10:33.466918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005d5d cdw11:00000000 00:08:54.828 [2024-11-20 15:10:33.466944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.828 [2024-11-20 15:10:33.466996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005d08 cdw11:00000000 00:08:54.828 [2024-11-20 15:10:33.467010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.828 #47 NEW cov: 12413 ft: 14882 corp: 34/192b lim: 10 exec/s: 47 rss: 75Mb L: 5/10 MS: 1 InsertRepeatedBytes- 00:08:55.088 [2024-11-20 15:10:33.527177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a6a cdw11:00000000 00:08:55.088 [2024-11-20 15:10:33.527205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.088 [2024-11-20 15:10:33.527258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006a6a cdw11:00000000 00:08:55.088 [2024-11-20 15:10:33.527272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.088 [2024-11-20 15:10:33.527324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000c26a cdw11:00000000 00:08:55.088 [2024-11-20 15:10:33.527338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:55.088 #48 NEW cov: 12413 ft: 14890 corp: 35/199b lim: 10 exec/s: 48 rss: 75Mb L: 7/10 MS: 1 ShuffleBytes- 00:08:55.088 [2024-11-20 15:10:33.567307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:55.088 
[2024-11-20 15:10:33.567338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.088 [2024-11-20 15:10:33.567391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a34a cdw11:00000000 00:08:55.088 [2024-11-20 15:10:33.567405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.088 [2024-11-20 15:10:33.567460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007f7f cdw11:00000000 00:08:55.088 [2024-11-20 15:10:33.567474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:55.088 #49 NEW cov: 12413 ft: 14892 corp: 36/205b lim: 10 exec/s: 49 rss: 75Mb L: 6/10 MS: 1 ShuffleBytes- 00:08:55.088 [2024-11-20 15:10:33.607456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000cd4a cdw11:00000000 00:08:55.088 [2024-11-20 15:10:33.607481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.088 [2024-11-20 15:10:33.607544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007fff cdw11:00000000 00:08:55.088 [2024-11-20 15:10:33.607558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.088 [2024-11-20 15:10:33.607610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffa3 cdw11:00000000 00:08:55.088 [2024-11-20 15:10:33.607624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:55.088 #50 NEW cov: 12413 ft: 14901 corp: 37/212b lim: 10 exec/s: 25 rss: 75Mb L: 7/10 MS: 1 CopyPart- 00:08:55.088 #50 DONE cov: 12413 ft: 14901 corp: 37/212b lim: 10 exec/s: 25 rss: 75Mb 00:08:55.088 Done 50 runs in 2 second(s) 00:08:55.088 15:10:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_7.conf /var/tmp/suppress_nvmf_fuzz 00:08:55.088 15:10:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:55.088 15:10:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:55.088 15:10:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:08:55.088 15:10:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:08:55.088 15:10:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:55.088 15:10:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:55.088 15:10:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:55.088 15:10:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:08:55.088 15:10:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:55.088 15:10:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:55.088 15:10:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 8 00:08:55.088 15:10:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4408 00:08:55.088 
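The NEW_FUNC lines in the run just completed resolve to TestOneInput in test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c, the libFuzzer entry point that every #N NEW status line passes through (cov counts coverage points, ft features, corp the corpus size in units/bytes, and MS the mutation sequence that produced the input). A minimal sketch of that entry-point shape, with a hypothetical command layout standing in for the real decoder, which builds full NVMe admin commands such as DELETE IO SQ (opcode 00h) and DELETE IO CQ (opcode 04h):

    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    /* Hypothetical layout for illustration only; the real harness builds
     * complete spdk_nvme_cmd structures from the fuzz input. */
    struct fuzz_cmd {
        uint8_t  opc;     /* admin opcode, e.g. 0x00 DELETE IO SQ, 0x04 DELETE IO CQ */
        uint32_t cdw10;   /* command dword 10, echoed in the *NOTICE* lines */
        uint32_t cdw11;   /* command dword 11 */
    };

    /* libFuzzer calls this once per generated input. */
    int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
    {
        struct fuzz_cmd cmd;

        if (size < sizeof(cmd)) {
            return 0;             /* too short to decode; skip this input */
        }
        memcpy(&cmd, data, sizeof(cmd));
        /* ...submit cmd to the admin qpair and print it, as nvme_qpair.c
         * does in the *NOTICE* lines throughout this log... */
        return 0;                 /* inputs that do not crash return 0 */
    }

Compiled with clang -fsanitize=fuzzer, libFuzzer supplies main() and drives this function with mutated inputs; the "Done N runs in 2 second(s)" footer reflects the -t 1 one-second budget each instance is given.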
15:10:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:55.088 15:10:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:08:55.088 15:10:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:55.088 15:10:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:55.088 15:10:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:55.088 15:10:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 00:08:55.346 [2024-11-20 15:10:33.788438] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:08:55.346 [2024-11-20 15:10:33.788509] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1476076 ] 00:08:55.346 [2024-11-20 15:10:34.003260] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:55.346 [2024-11-20 15:10:34.017800] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:55.606 [2024-11-20 15:10:34.070828] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:55.606 [2024-11-20 15:10:34.087051] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:08:55.606 INFO: Running with entropic power schedule (0xFF, 100). 
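The run.sh trace above is a complete recipe for this fuzzer instance: derive the per-fuzzer TCP port from the index, create the corpus directory, rewrite the trsvcid in the JSON target config, register two LSAN leak suppressions, and launch llvm_nvme_fuzz against the resulting trid. A minimal sketch of replaying it by hand outside Jenkins follows, assuming a local SPDK checkout built with clang/libFuzzer support; SPDK_DIR, the -P output prefix, and the /tmp paths are placeholders, not the Jenkins ones.

#!/usr/bin/env bash
# Sketch only: replays the start_llvm_fuzz 8 sequence traced above (paths are placeholders).
set -euo pipefail
SPDK_DIR=$HOME/spdk                 # hypothetical checkout location
i=8                                 # fuzzer index
port=$(printf '44%02d' "$i")        # mirrors the logged "printf %02d 8" / port=4408 pair
corpus_dir=$SPDK_DIR/../corpus/llvm_nvmf_$i
mkdir -p "$corpus_dir"
# Point the NVMe-oF target config at the per-fuzzer port, as the logged sed does:
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_$i.conf"
# The two leak suppressions echoed in the trace, fed to LeakSanitizer:
printf 'leak:spdk_nvmf_qpair_disconnect\nleak:nvmf_ctrlr_create\n' > /var/tmp/suppress_nvmf_fuzz
export LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
# Same flags as the logged invocation; -P is the artifact/output prefix (placeholder here):
"$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
    -P /tmp/llvm_out/ \
    -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
    -c "/tmp/fuzz_json_$i.conf" -t 1 -D "$corpus_dir" -Z "$i"

The DPDK/EAL startup records just above imply the usual SPDK prerequisites (hugepages, sufficient privileges); -t 1 carries run.sh's timen=1 time budget, which matches the short "Done N runs in 2 second(s)" runs seen throughout this log.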
00:08:55.606 INFO: Seed: 921071923 00:08:55.606 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:08:55.606 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:08:55.606 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:55.606 INFO: A corpus is not provided, starting from an empty corpus 00:08:55.606 [2024-11-20 15:10:34.157859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.606 [2024-11-20 15:10:34.157899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.606 #2 INITED cov: 12205 ft: 12200 corp: 1/1b exec/s: 0 rss: 72Mb 00:08:55.606 [2024-11-20 15:10:34.209378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.606 [2024-11-20 15:10:34.209406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.606 [2024-11-20 15:10:34.209514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.606 [2024-11-20 15:10:34.209529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.606 [2024-11-20 15:10:34.209636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.606 [2024-11-20 15:10:34.209654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:55.606 [2024-11-20 15:10:34.209749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.606 [2024-11-20 15:10:34.209764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:55.606 [2024-11-20 15:10:34.209859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.606 [2024-11-20 15:10:34.209874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:55.606 #3 NEW cov: 12318 ft: 13576 corp: 2/6b lim: 5 exec/s: 0 rss: 72Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:55.606 [2024-11-20 15:10:34.279544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.606 [2024-11-20 15:10:34.279569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.606 [2024-11-20 15:10:34.279656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.606 [2024-11-20 15:10:34.279669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.606 [2024-11-20 15:10:34.279774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.606 [2024-11-20 15:10:34.279792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:55.606 [2024-11-20 15:10:34.279886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.606 [2024-11-20 15:10:34.279900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:55.606 [2024-11-20 15:10:34.279990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.606 [2024-11-20 15:10:34.280005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:55.866 #4 NEW cov: 12324 ft: 13827 corp: 3/11b lim: 5 exec/s: 0 rss: 72Mb L: 5/5 MS: 1 ChangeBinInt- 00:08:55.866 [2024-11-20 15:10:34.349799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.866 [2024-11-20 15:10:34.349825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.866 [2024-11-20 15:10:34.349908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.866 [2024-11-20 15:10:34.349923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.866 [2024-11-20 15:10:34.350009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.866 [2024-11-20 15:10:34.350025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:55.866 [2024-11-20 15:10:34.350114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.866 [2024-11-20 15:10:34.350129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:55.866 [2024-11-20 15:10:34.350221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.866 [2024-11-20 15:10:34.350236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:55.866 #5 NEW cov: 12409 ft: 14046 corp: 4/16b lim: 5 exec/s: 0 rss: 72Mb L: 5/5 MS: 1 CrossOver- 00:08:55.866 [2024-11-20 15:10:34.419660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.866 [2024-11-20 15:10:34.419684] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.866 [2024-11-20 15:10:34.419782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.866 [2024-11-20 15:10:34.419796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.866 [2024-11-20 15:10:34.419886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.866 [2024-11-20 15:10:34.419899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:55.866 [2024-11-20 15:10:34.419993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.866 [2024-11-20 15:10:34.420011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:55.866 #6 NEW cov: 12409 ft: 14179 corp: 5/20b lim: 5 exec/s: 0 rss: 72Mb L: 4/5 MS: 1 EraseBytes- 00:08:55.866 [2024-11-20 15:10:34.470265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.866 [2024-11-20 15:10:34.470289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.866 [2024-11-20 15:10:34.470396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.866 [2024-11-20 15:10:34.470411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.866 [2024-11-20 15:10:34.470494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.866 [2024-11-20 15:10:34.470509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:55.866 [2024-11-20 15:10:34.470595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.866 [2024-11-20 15:10:34.470610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:55.866 [2024-11-20 15:10:34.470700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.866 [2024-11-20 15:10:34.470715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:55.866 #7 NEW cov: 12409 ft: 14263 corp: 6/25b lim: 5 exec/s: 0 rss: 73Mb L: 5/5 MS: 1 CopyPart- 00:08:55.866 [2024-11-20 15:10:34.540470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:55.866 [2024-11-20 15:10:34.540494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.866 [2024-11-20 15:10:34.540583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.866 [2024-11-20 15:10:34.540598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.866 [2024-11-20 15:10:34.540697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.866 [2024-11-20 15:10:34.540713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:55.866 [2024-11-20 15:10:34.540821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.866 [2024-11-20 15:10:34.540842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:55.866 [2024-11-20 15:10:34.540979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.866 [2024-11-20 15:10:34.541000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:56.126 #8 NEW cov: 12409 ft: 14319 corp: 7/30b lim: 5 exec/s: 0 rss: 73Mb L: 5/5 MS: 1 ChangeByte- 00:08:56.126 [2024-11-20 15:10:34.590926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.126 [2024-11-20 15:10:34.590955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.126 [2024-11-20 15:10:34.591046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.126 [2024-11-20 15:10:34.591061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.126 [2024-11-20 15:10:34.591142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.126 [2024-11-20 15:10:34.591156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:56.126 [2024-11-20 15:10:34.591241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.126 [2024-11-20 15:10:34.591256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:56.126 [2024-11-20 15:10:34.591369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.126 [2024-11-20 
15:10:34.591384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:56.126 #9 NEW cov: 12409 ft: 14362 corp: 8/35b lim: 5 exec/s: 0 rss: 73Mb L: 5/5 MS: 1 ChangeBit- 00:08:56.126 [2024-11-20 15:10:34.660916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.126 [2024-11-20 15:10:34.660941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.126 [2024-11-20 15:10:34.661031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.126 [2024-11-20 15:10:34.661047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.126 [2024-11-20 15:10:34.661140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.126 [2024-11-20 15:10:34.661155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:56.126 [2024-11-20 15:10:34.661246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.126 [2024-11-20 15:10:34.661261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:56.126 #10 NEW cov: 12409 ft: 14396 corp: 9/39b lim: 5 exec/s: 0 rss: 73Mb L: 4/5 MS: 1 ShuffleBytes- 00:08:56.126 [2024-11-20 15:10:34.711078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.126 [2024-11-20 15:10:34.711103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.126 [2024-11-20 15:10:34.711186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.126 [2024-11-20 15:10:34.711201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.126 [2024-11-20 15:10:34.711294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.126 [2024-11-20 15:10:34.711312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:56.126 [2024-11-20 15:10:34.711402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.126 [2024-11-20 15:10:34.711417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:56.126 #11 NEW cov: 12409 ft: 14468 corp: 10/43b lim: 5 exec/s: 0 rss: 73Mb L: 4/5 MS: 1 ChangeBit- 00:08:56.126 [2024-11-20 15:10:34.760606] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.126 [2024-11-20 15:10:34.760632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.126 [2024-11-20 15:10:34.760724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.126 [2024-11-20 15:10:34.760741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.126 #12 NEW cov: 12409 ft: 14714 corp: 11/45b lim: 5 exec/s: 0 rss: 73Mb L: 2/5 MS: 1 CrossOver- 00:08:56.386 [2024-11-20 15:10:34.832131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.386 [2024-11-20 15:10:34.832159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.386 [2024-11-20 15:10:34.832262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.386 [2024-11-20 15:10:34.832278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.386 [2024-11-20 15:10:34.832387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.386 [2024-11-20 15:10:34.832403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:56.386 [2024-11-20 15:10:34.832492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.386 [2024-11-20 15:10:34.832506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:56.386 [2024-11-20 15:10:34.832599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.386 [2024-11-20 15:10:34.832614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:56.386 #13 NEW cov: 12409 ft: 14736 corp: 12/50b lim: 5 exec/s: 0 rss: 73Mb L: 5/5 MS: 1 ChangeByte- 00:08:56.386 [2024-11-20 15:10:34.881313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.386 [2024-11-20 15:10:34.881343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.386 [2024-11-20 15:10:34.881451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.386 [2024-11-20 15:10:34.881466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:08:56.386 #14 NEW cov: 12409 ft: 14835 corp: 13/52b lim: 5 exec/s: 0 rss: 73Mb L: 2/5 MS: 1 ShuffleBytes- 00:08:56.386 [2024-11-20 15:10:34.952727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.386 [2024-11-20 15:10:34.952753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.386 [2024-11-20 15:10:34.952861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.386 [2024-11-20 15:10:34.952876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.386 [2024-11-20 15:10:34.952964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.386 [2024-11-20 15:10:34.952978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:56.386 [2024-11-20 15:10:34.953066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.386 [2024-11-20 15:10:34.953079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:56.386 [2024-11-20 15:10:34.953165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.386 [2024-11-20 15:10:34.953180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:56.386 #15 NEW cov: 12409 ft: 14848 corp: 14/57b lim: 5 exec/s: 0 rss: 73Mb L: 5/5 MS: 1 ShuffleBytes- 00:08:56.386 [2024-11-20 15:10:35.002989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.386 [2024-11-20 15:10:35.003014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.386 [2024-11-20 15:10:35.003121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.386 [2024-11-20 15:10:35.003137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.386 [2024-11-20 15:10:35.003225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.386 [2024-11-20 15:10:35.003240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:56.386 [2024-11-20 15:10:35.003331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.386 [2024-11-20 15:10:35.003347] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:56.386 [2024-11-20 15:10:35.003433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.386 [2024-11-20 15:10:35.003447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:56.645 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:56.645 #16 NEW cov: 12432 ft: 14851 corp: 15/62b lim: 5 exec/s: 16 rss: 74Mb L: 5/5 MS: 1 InsertByte- 00:08:56.904 [2024-11-20 15:10:35.332727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.904 [2024-11-20 15:10:35.332776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.905 #17 NEW cov: 12432 ft: 14937 corp: 16/63b lim: 5 exec/s: 17 rss: 74Mb L: 1/5 MS: 1 EraseBytes- 00:08:56.905 [2024-11-20 15:10:35.403800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.905 [2024-11-20 15:10:35.403826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.905 [2024-11-20 15:10:35.403916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.905 [2024-11-20 15:10:35.403931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.905 [2024-11-20 15:10:35.404019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.905 [2024-11-20 15:10:35.404034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:56.905 [2024-11-20 15:10:35.404130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.905 [2024-11-20 15:10:35.404145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:56.905 #18 NEW cov: 12432 ft: 14949 corp: 17/67b lim: 5 exec/s: 18 rss: 74Mb L: 4/5 MS: 1 ChangeBinInt- 00:08:56.905 [2024-11-20 15:10:35.473579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.905 [2024-11-20 15:10:35.473604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.905 [2024-11-20 15:10:35.473688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.905 [2024-11-20 15:10:35.473704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
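The #N NEW records interleaved with the qpair notices are standard libFuzzer status lines: cov is the number of coverage points currently hit, ft the number of features, corp the corpus size (entries/bytes), lim the current input-length cap, exec/s the execution rate, L roughly the new input's length against the largest seen, and MS the mutation sequence that produced it. A rough way to pull the coverage trajectory out of a saved copy of this console output; fuzz.log is a hypothetical file name, not an artifact of this job:

# Prints "event# cov ft" for every input the fuzzer kept (NEW) or shrank (REDUCE).
grep -oE '#[0-9]+ (NEW|REDUCE) +cov: [0-9]+ ft: [0-9]+' fuzz.log \
    | awk '{print $1, $4, $6}'

In the records around this point cov holds steady at 12432 while ft keeps climbing, i.e. late inputs are being kept for new feature combinations rather than for reaching new code.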
00:08:56.905 #19 NEW cov: 12432 ft: 14953 corp: 18/69b lim: 5 exec/s: 19 rss: 74Mb L: 2/5 MS: 1 CopyPart- 00:08:56.905 [2024-11-20 15:10:35.524923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.905 [2024-11-20 15:10:35.524948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.905 [2024-11-20 15:10:35.525035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.905 [2024-11-20 15:10:35.525050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.905 [2024-11-20 15:10:35.525142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.905 [2024-11-20 15:10:35.525155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:56.905 [2024-11-20 15:10:35.525243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.905 [2024-11-20 15:10:35.525257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:56.905 #20 NEW cov: 12432 ft: 15021 corp: 19/73b lim: 5 exec/s: 20 rss: 75Mb L: 4/5 MS: 1 ChangeBinInt- 00:08:57.165 [2024-11-20 15:10:35.595825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.165 [2024-11-20 15:10:35.595854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.165 [2024-11-20 15:10:35.595945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.165 [2024-11-20 15:10:35.595960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.165 [2024-11-20 15:10:35.596060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.165 [2024-11-20 15:10:35.596076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:57.165 [2024-11-20 15:10:35.596183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.165 [2024-11-20 15:10:35.596204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:57.165 [2024-11-20 15:10:35.596327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.165 [2024-11-20 15:10:35.596361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:57.165 #21 NEW cov: 12432 ft: 15028 corp: 20/78b lim: 5 exec/s: 21 rss: 75Mb L: 5/5 MS: 1 ShuffleBytes- 00:08:57.165 [2024-11-20 15:10:35.665605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.165 [2024-11-20 15:10:35.665631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.165 [2024-11-20 15:10:35.665723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.165 [2024-11-20 15:10:35.665738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.165 [2024-11-20 15:10:35.665830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.165 [2024-11-20 15:10:35.665845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:57.165 [2024-11-20 15:10:35.665938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.166 [2024-11-20 15:10:35.665952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:57.166 #22 NEW cov: 12432 ft: 15079 corp: 21/82b lim: 5 exec/s: 22 rss: 75Mb L: 4/5 MS: 1 ChangeBinInt- 00:08:57.166 [2024-11-20 15:10:35.736099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.166 [2024-11-20 15:10:35.736123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.166 [2024-11-20 15:10:35.736218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.166 [2024-11-20 15:10:35.736232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.166 [2024-11-20 15:10:35.736330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.166 [2024-11-20 15:10:35.736365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:57.166 [2024-11-20 15:10:35.736458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.166 [2024-11-20 15:10:35.736473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:57.166 [2024-11-20 15:10:35.736562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.166 [2024-11-20 
15:10:35.736577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:57.166 #23 NEW cov: 12432 ft: 15092 corp: 22/87b lim: 5 exec/s: 23 rss: 75Mb L: 5/5 MS: 1 ShuffleBytes- 00:08:57.166 [2024-11-20 15:10:35.785646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.166 [2024-11-20 15:10:35.785672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.166 [2024-11-20 15:10:35.785757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.166 [2024-11-20 15:10:35.785772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.166 [2024-11-20 15:10:35.785861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.166 [2024-11-20 15:10:35.785875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:57.166 #24 NEW cov: 12432 ft: 15250 corp: 23/90b lim: 5 exec/s: 24 rss: 75Mb L: 3/5 MS: 1 EraseBytes- 00:08:57.166 [2024-11-20 15:10:35.836501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.166 [2024-11-20 15:10:35.836525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.166 [2024-11-20 15:10:35.836609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.166 [2024-11-20 15:10:35.836623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.166 [2024-11-20 15:10:35.836703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.166 [2024-11-20 15:10:35.836717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:57.166 [2024-11-20 15:10:35.836803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.166 [2024-11-20 15:10:35.836817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:57.166 [2024-11-20 15:10:35.836908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.166 [2024-11-20 15:10:35.836923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:57.426 #25 NEW cov: 12432 ft: 15268 corp: 24/95b lim: 5 exec/s: 25 rss: 75Mb L: 5/5 MS: 1 ShuffleBytes- 00:08:57.426 [2024-11-20 15:10:35.906674] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.426 [2024-11-20 15:10:35.906699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.426 [2024-11-20 15:10:35.906789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.426 [2024-11-20 15:10:35.906804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.426 [2024-11-20 15:10:35.906919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.426 [2024-11-20 15:10:35.906934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:57.426 [2024-11-20 15:10:35.907017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.426 [2024-11-20 15:10:35.907031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:57.426 #26 NEW cov: 12432 ft: 15277 corp: 25/99b lim: 5 exec/s: 26 rss: 75Mb L: 4/5 MS: 1 CrossOver- 00:08:57.426 [2024-11-20 15:10:35.956435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.426 [2024-11-20 15:10:35.956460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.426 [2024-11-20 15:10:35.956544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.426 [2024-11-20 15:10:35.956559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.426 [2024-11-20 15:10:35.956647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.426 [2024-11-20 15:10:35.956661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:57.426 #27 NEW cov: 12432 ft: 15285 corp: 26/102b lim: 5 exec/s: 27 rss: 75Mb L: 3/5 MS: 1 EraseBytes- 00:08:57.426 [2024-11-20 15:10:36.006676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.426 [2024-11-20 15:10:36.006702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.426 [2024-11-20 15:10:36.006788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.426 [2024-11-20 15:10:36.006804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 
m:0 dnr:0 00:08:57.426 [2024-11-20 15:10:36.006897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.426 [2024-11-20 15:10:36.006913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:57.426 #28 NEW cov: 12432 ft: 15329 corp: 27/105b lim: 5 exec/s: 28 rss: 75Mb L: 3/5 MS: 1 CrossOver- 00:08:57.426 [2024-11-20 15:10:36.057485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.426 [2024-11-20 15:10:36.057510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.426 [2024-11-20 15:10:36.057618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.426 [2024-11-20 15:10:36.057633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.426 [2024-11-20 15:10:36.057724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.426 [2024-11-20 15:10:36.057740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:57.426 [2024-11-20 15:10:36.057833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.426 [2024-11-20 15:10:36.057847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:57.426 [2024-11-20 15:10:36.057939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.426 [2024-11-20 15:10:36.057953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:57.426 #29 NEW cov: 12432 ft: 15344 corp: 28/110b lim: 5 exec/s: 29 rss: 75Mb L: 5/5 MS: 1 ShuffleBytes- 00:08:57.426 [2024-11-20 15:10:36.107764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.426 [2024-11-20 15:10:36.107792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.426 [2024-11-20 15:10:36.107888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.426 [2024-11-20 15:10:36.107904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.426 [2024-11-20 15:10:36.107988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.427 [2024-11-20 15:10:36.108005] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:57.427 [2024-11-20 15:10:36.108096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:57.427 [2024-11-20 15:10:36.108112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:57.427 [2024-11-20 15:10:36.108208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:57.427 [2024-11-20 15:10:36.108224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:08:57.686 #30 NEW cov: 12432 ft: 15360 corp: 29/115b lim: 5 exec/s: 15 rss: 75Mb L: 5/5 MS: 1 InsertByte-
00:08:57.686 #30 DONE cov: 12432 ft: 15360 corp: 29/115b lim: 5 exec/s: 15 rss: 75Mb
00:08:57.686 Done 30 runs in 2 second(s)
00:08:57.686 15:10:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_8.conf /var/tmp/suppress_nvmf_fuzz
00:08:57.686 15:10:36 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
00:08:57.686 15:10:36 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:57.686 15:10:36 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1
00:08:57.686 15:10:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=9
00:08:57.686 15:10:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1
00:08:57.686 15:10:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1
00:08:57.686 15:10:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9
00:08:57.686 15:10:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf
00:08:57.686 15:10:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:08:57.686 15:10:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:08:57.686 15:10:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 9
00:08:57.686 15:10:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4409
00:08:57.686 15:10:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9
00:08:57.686 15:10:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409'
00:08:57.686 15:10:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:57.686 15:10:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:57.686 15:10:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:08:57.686 15:10:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9
[2024-11-20 15:10:36.293145] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization...
[2024-11-20 15:10:36.293226] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1476417 ]
[2024-11-20 15:10:36.507852] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-11-20 15:10:36.522746] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
[2024-11-20 15:10:36.575488] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
[2024-11-20 15:10:36.591730] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 ***
INFO: Running with entropic power schedule (0xFF, 100).
00:08:57.946 INFO: Seed: 3428064036
00:08:58.205 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36),
00:08:58.205 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8),
00:08:58.205 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9
00:08:58.205 INFO: A corpus is not provided, starting from an empty corpus
00:08:58.205 [2024-11-20 15:10:36.669306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:58.205 [2024-11-20 15:10:36.669357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:58.205 #2 INITED cov: 12205 ft: 12200 corp: 1/1b exec/s: 0 rss: 72Mb
00:08:58.205 [2024-11-20 15:10:36.719262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:58.205 [2024-11-20 15:10:36.719294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:58.205 #3 NEW cov: 12318 ft: 12754 corp: 2/2b lim: 5 exec/s: 0 rss: 73Mb L: 1/1 MS: 1 CopyPart-
00:08:58.205 [2024-11-20 15:10:36.789480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:58.205 [2024-11-20 15:10:36.789508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:58.205 #4 NEW cov: 12324 ft: 12960 corp: 3/3b lim: 5 exec/s: 0 rss: 73Mb L: 1/1 MS: 1 ChangeBit-
00:08:58.205 [2024-11-20 15:10:36.839589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:58.205 [2024-11-20 15:10:36.839617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:58.205 #5 NEW cov: 12409 ft: 13241 corp: 4/4b lim: 5 exec/s: 0 rss: 73Mb L: 1/1 MS: 1 ChangeByte-
00:08:58.465 [2024-11-20 15:10:36.911039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.465
[2024-11-20 15:10:36.911068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.465 [2024-11-20 15:10:36.911166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.465 [2024-11-20 15:10:36.911182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.465 [2024-11-20 15:10:36.911274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.465 [2024-11-20 15:10:36.911292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:58.465 [2024-11-20 15:10:36.911397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.465 [2024-11-20 15:10:36.911413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:58.465 #6 NEW cov: 12409 ft: 14095 corp: 5/8b lim: 5 exec/s: 0 rss: 73Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:58.465 [2024-11-20 15:10:36.981681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.465 [2024-11-20 15:10:36.981708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.465 [2024-11-20 15:10:36.981799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.465 [2024-11-20 15:10:36.981815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.465 [2024-11-20 15:10:36.981913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.465 [2024-11-20 15:10:36.981930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:58.465 [2024-11-20 15:10:36.982034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.465 [2024-11-20 15:10:36.982051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:58.465 [2024-11-20 15:10:36.982144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.465 [2024-11-20 15:10:36.982160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:58.465 #7 NEW cov: 12409 ft: 14253 corp: 6/13b lim: 5 exec/s: 0 rss: 73Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:58.465 [2024-11-20 15:10:37.050563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 
cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.465 [2024-11-20 15:10:37.050594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.465 #8 NEW cov: 12409 ft: 14326 corp: 7/14b lim: 5 exec/s: 0 rss: 73Mb L: 1/5 MS: 1 ChangeByte- 00:08:58.465 [2024-11-20 15:10:37.101138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.465 [2024-11-20 15:10:37.101165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.465 [2024-11-20 15:10:37.101258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.465 [2024-11-20 15:10:37.101275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.465 #9 NEW cov: 12409 ft: 14559 corp: 8/16b lim: 5 exec/s: 0 rss: 73Mb L: 2/5 MS: 1 EraseBytes- 00:08:58.725 [2024-11-20 15:10:37.171609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.725 [2024-11-20 15:10:37.171636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.725 [2024-11-20 15:10:37.171726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.725 [2024-11-20 15:10:37.171742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.725 #10 NEW cov: 12409 ft: 14602 corp: 9/18b lim: 5 exec/s: 0 rss: 73Mb L: 2/5 MS: 1 CopyPart- 00:08:58.725 [2024-11-20 15:10:37.221918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.725 [2024-11-20 15:10:37.221945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.725 [2024-11-20 15:10:37.222046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.725 [2024-11-20 15:10:37.222065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.725 #11 NEW cov: 12409 ft: 14678 corp: 10/20b lim: 5 exec/s: 0 rss: 73Mb L: 2/5 MS: 1 InsertByte- 00:08:58.725 [2024-11-20 15:10:37.291865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.725 [2024-11-20 15:10:37.291893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.725 #12 NEW cov: 12409 ft: 14712 corp: 11/21b lim: 5 exec/s: 0 rss: 73Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:58.725 [2024-11-20 15:10:37.343199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 
cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.725 [2024-11-20 15:10:37.343227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.725 [2024-11-20 15:10:37.343334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.725 [2024-11-20 15:10:37.343363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.725 [2024-11-20 15:10:37.343455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.725 [2024-11-20 15:10:37.343474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:58.725 [2024-11-20 15:10:37.343560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.725 [2024-11-20 15:10:37.343577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:58.725 #13 NEW cov: 12409 ft: 14746 corp: 12/25b lim: 5 exec/s: 0 rss: 73Mb L: 4/5 MS: 1 CopyPart- 00:08:58.985 [2024-11-20 15:10:37.412475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.985 [2024-11-20 15:10:37.412505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.985 #14 NEW cov: 12409 ft: 14787 corp: 13/26b lim: 5 exec/s: 0 rss: 73Mb L: 1/5 MS: 1 CopyPart- 00:08:58.985 [2024-11-20 15:10:37.463361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.985 [2024-11-20 15:10:37.463389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.985 [2024-11-20 15:10:37.463478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.985 [2024-11-20 15:10:37.463494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.985 [2024-11-20 15:10:37.463586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.985 [2024-11-20 15:10:37.463603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:58.985 #15 NEW cov: 12409 ft: 14973 corp: 14/29b lim: 5 exec/s: 0 rss: 73Mb L: 3/5 MS: 1 CrossOver- 00:08:58.985 [2024-11-20 15:10:37.513193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.985 [2024-11-20 15:10:37.513219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:58.985 [2024-11-20 15:10:37.513310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.985 [2024-11-20 15:10:37.513330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:59.244 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:59.244 #16 NEW cov: 12432 ft: 15008 corp: 15/31b lim: 5 exec/s: 16 rss: 74Mb L: 2/5 MS: 1 CopyPart- 00:08:59.244 [2024-11-20 15:10:37.845042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.244 [2024-11-20 15:10:37.845083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.244 [2024-11-20 15:10:37.845180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.244 [2024-11-20 15:10:37.845197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:59.244 [2024-11-20 15:10:37.845287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.244 [2024-11-20 15:10:37.845304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:59.244 [2024-11-20 15:10:37.845413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.244 [2024-11-20 15:10:37.845431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:59.244 #17 NEW cov: 12432 ft: 15037 corp: 16/35b lim: 5 exec/s: 17 rss: 74Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:08:59.244 [2024-11-20 15:10:37.904324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.244 [2024-11-20 15:10:37.904355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.244 [2024-11-20 15:10:37.904459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.244 [2024-11-20 15:10:37.904478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:59.503 #18 NEW cov: 12432 ft: 15053 corp: 17/37b lim: 5 exec/s: 18 rss: 74Mb L: 2/5 MS: 1 EraseBytes- 00:08:59.503 [2024-11-20 15:10:37.974492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.503 [2024-11-20 15:10:37.974521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.503 #19 NEW cov: 12432 ft: 15089 corp: 
18/38b lim: 5 exec/s: 19 rss: 74Mb L: 1/5 MS: 1 EraseBytes- 00:08:59.503 [2024-11-20 15:10:38.044694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.503 [2024-11-20 15:10:38.044722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.503 #20 NEW cov: 12432 ft: 15129 corp: 19/39b lim: 5 exec/s: 20 rss: 74Mb L: 1/5 MS: 1 ChangeByte- 00:08:59.503 [2024-11-20 15:10:38.115511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.503 [2024-11-20 15:10:38.115538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.503 [2024-11-20 15:10:38.115636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.503 [2024-11-20 15:10:38.115651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:59.503 #21 NEW cov: 12432 ft: 15155 corp: 20/41b lim: 5 exec/s: 21 rss: 74Mb L: 2/5 MS: 1 CrossOver- 00:08:59.503 [2024-11-20 15:10:38.165765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.503 [2024-11-20 15:10:38.165792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.503 [2024-11-20 15:10:38.165883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.503 [2024-11-20 15:10:38.165899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:59.763 #22 NEW cov: 12432 ft: 15206 corp: 21/43b lim: 5 exec/s: 22 rss: 74Mb L: 2/5 MS: 1 CrossOver- 00:08:59.763 [2024-11-20 15:10:38.216320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.763 [2024-11-20 15:10:38.216348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.763 [2024-11-20 15:10:38.216446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.763 [2024-11-20 15:10:38.216462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:59.763 #23 NEW cov: 12432 ft: 15216 corp: 22/45b lim: 5 exec/s: 23 rss: 74Mb L: 2/5 MS: 1 InsertByte- 00:08:59.763 [2024-11-20 15:10:38.266350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.763 [2024-11-20 15:10:38.266378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.763 #24 NEW cov: 12432 ft: 
15218 corp: 23/46b lim: 5 exec/s: 24 rss: 74Mb L: 1/5 MS: 1 CrossOver- 00:08:59.763 [2024-11-20 15:10:38.337371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.763 [2024-11-20 15:10:38.337399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.763 [2024-11-20 15:10:38.337496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.763 [2024-11-20 15:10:38.337514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:59.763 #25 NEW cov: 12432 ft: 15226 corp: 24/48b lim: 5 exec/s: 25 rss: 74Mb L: 2/5 MS: 1 ChangeBit- 00:08:59.763 [2024-11-20 15:10:38.387708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.763 [2024-11-20 15:10:38.387736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.763 [2024-11-20 15:10:38.387829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.763 [2024-11-20 15:10:38.387845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:59.763 #26 NEW cov: 12432 ft: 15281 corp: 25/50b lim: 5 exec/s: 26 rss: 74Mb L: 2/5 MS: 1 ShuffleBytes- 00:08:59.764 [2024-11-20 15:10:38.438588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.764 [2024-11-20 15:10:38.438614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.764 [2024-11-20 15:10:38.438712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.764 [2024-11-20 15:10:38.438728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:59.764 [2024-11-20 15:10:38.438828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.764 [2024-11-20 15:10:38.438847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.023 #27 NEW cov: 12432 ft: 15287 corp: 26/53b lim: 5 exec/s: 27 rss: 74Mb L: 3/5 MS: 1 InsertByte- 00:09:00.023 [2024-11-20 15:10:38.489545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.023 [2024-11-20 15:10:38.489573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.023 [2024-11-20 15:10:38.489672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 
cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.023 [2024-11-20 15:10:38.489690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.023 [2024-11-20 15:10:38.489792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.023 [2024-11-20 15:10:38.489809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.023 [2024-11-20 15:10:38.489902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.023 [2024-11-20 15:10:38.489920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:00.023 [2024-11-20 15:10:38.490016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.023 [2024-11-20 15:10:38.490032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:00.023 #28 NEW cov: 12432 ft: 15295 corp: 27/58b lim: 5 exec/s: 28 rss: 75Mb L: 5/5 MS: 1 CMP- DE: "\001\000\000\000"- 00:09:00.023 [2024-11-20 15:10:38.548721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.023 [2024-11-20 15:10:38.548748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.023 [2024-11-20 15:10:38.548833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.023 [2024-11-20 15:10:38.548851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.023 #29 NEW cov: 12432 ft: 15320 corp: 28/60b lim: 5 exec/s: 29 rss: 75Mb L: 2/5 MS: 1 CrossOver- 00:09:00.023 [2024-11-20 15:10:38.619469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.023 [2024-11-20 15:10:38.619495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.023 [2024-11-20 15:10:38.619600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.023 [2024-11-20 15:10:38.619619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.023 [2024-11-20 15:10:38.619711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.023 [2024-11-20 15:10:38.619727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.023 #30 NEW cov: 12432 ft: 15336 corp: 29/63b 
lim: 5 exec/s: 15 rss: 75Mb L: 3/5 MS: 1 InsertByte- 00:09:00.023 #30 DONE cov: 12432 ft: 15336 corp: 29/63b lim: 5 exec/s: 15 rss: 75Mb 00:09:00.023 ###### Recommended dictionary. ###### 00:09:00.023 "\001\000\000\000" # Uses: 0 00:09:00.023 ###### End of recommended dictionary. ###### 00:09:00.023 Done 30 runs in 2 second(s) 00:09:00.283 15:10:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_9.conf /var/tmp/suppress_nvmf_fuzz 00:09:00.283 15:10:38 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:00.283 15:10:38 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:00.283 15:10:38 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:09:00.283 15:10:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:09:00.283 15:10:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:09:00.283 15:10:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:09:00.283 15:10:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:09:00.283 15:10:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:09:00.283 15:10:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:00.283 15:10:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:00.283 15:10:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 10 00:09:00.283 15:10:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4410 00:09:00.283 15:10:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:09:00.283 15:10:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:09:00.283 15:10:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:00.283 15:10:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:00.283 15:10:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:00.283 15:10:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 00:09:00.283 [2024-11-20 15:10:38.801129] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
00:09:00.283 [2024-11-20 15:10:38.801200] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1476768 ] 00:09:00.541 [2024-11-20 15:10:39.012741] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:00.541 [2024-11-20 15:10:39.027257] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:00.541 [2024-11-20 15:10:39.079980] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:00.541 [2024-11-20 15:10:39.096225] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:09:00.541 INFO: Running with entropic power schedule (0xFF, 100). 00:09:00.541 INFO: Seed: 1637095400 00:09:00.541 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:09:00.541 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:09:00.541 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:09:00.541 INFO: A corpus is not provided, starting from an empty corpus 00:09:00.541 #2 INITED exec/s: 0 rss: 66Mb 00:09:00.541 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:00.541 This may also happen if the target rejected all inputs we tried so far 00:09:00.541 [2024-11-20 15:10:39.173560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:00.541 [2024-11-20 15:10:39.173602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.108 NEW_FUNC[1/713]: 0x466508 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:09:01.108 NEW_FUNC[2/713]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:01.108 #4 NEW cov: 12198 ft: 12193 corp: 2/10b lim: 40 exec/s: 0 rss: 73Mb L: 9/9 MS: 2 InsertRepeatedBytes-CopyPart- 00:09:01.108 [2024-11-20 15:10:39.524220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:26ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.108 [2024-11-20 15:10:39.524267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.108 NEW_FUNC[1/2]: 0x1c57348 in reactor_post_process_lw_thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:932 00:09:01.108 NEW_FUNC[2/2]: 0x1fa2c38 in spdk_thread_get_from_ctx /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:820 00:09:01.108 #6 NEW cov: 12340 ft: 12877 corp: 3/21b lim: 40 exec/s: 0 rss: 74Mb L: 11/11 MS: 2 ChangeByte-InsertRepeatedBytes- 00:09:01.108 [2024-11-20 15:10:39.584588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:26ffffff cdw11:ffff10ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.108 [2024-11-20 15:10:39.584615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.108 #7 NEW cov: 12346 ft: 13123 corp: 4/33b lim: 40 exec/s: 0 rss: 74Mb L: 
12/12 MS: 1 InsertByte- 00:09:01.108 [2024-11-20 15:10:39.655042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:36ffffff cdw11:ffff10ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.108 [2024-11-20 15:10:39.655068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.108 #8 NEW cov: 12431 ft: 13441 corp: 5/45b lim: 40 exec/s: 0 rss: 74Mb L: 12/12 MS: 1 ChangeBit- 00:09:01.108 [2024-11-20 15:10:39.725577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:26ffffff cdw11:ffff24ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.108 [2024-11-20 15:10:39.725603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.108 #9 NEW cov: 12431 ft: 13524 corp: 6/57b lim: 40 exec/s: 0 rss: 74Mb L: 12/12 MS: 1 ChangeByte- 00:09:01.108 [2024-11-20 15:10:39.776021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:26ffffff cdw11:ffff10ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.108 [2024-11-20 15:10:39.776049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.367 #10 NEW cov: 12431 ft: 13647 corp: 7/69b lim: 40 exec/s: 0 rss: 74Mb L: 12/12 MS: 1 ChangeBinInt- 00:09:01.367 [2024-11-20 15:10:39.826104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:20000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.367 [2024-11-20 15:10:39.826131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.367 #11 NEW cov: 12431 ft: 13691 corp: 8/78b lim: 40 exec/s: 0 rss: 74Mb L: 9/12 MS: 1 ChangeBit- 00:09:01.367 [2024-11-20 15:10:39.896612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:26ffff60 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.367 [2024-11-20 15:10:39.896641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.367 #12 NEW cov: 12431 ft: 13707 corp: 9/89b lim: 40 exec/s: 0 rss: 74Mb L: 11/12 MS: 1 ChangeByte- 00:09:01.367 [2024-11-20 15:10:39.946637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:26ffffff cdw11:ffff10ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.367 [2024-11-20 15:10:39.946666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.367 #13 NEW cov: 12431 ft: 13716 corp: 10/101b lim: 40 exec/s: 0 rss: 74Mb L: 12/12 MS: 1 ChangeByte- 00:09:01.367 [2024-11-20 15:10:40.017008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:20000000 cdw11:00010000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.367 [2024-11-20 15:10:40.017040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.626 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:01.626 #14 NEW cov: 12454 ft: 13834 corp: 11/110b lim: 40 exec/s: 0 rss: 74Mb L: 9/12 MS: 1 ChangeBit- 
00:09:01.626 [2024-11-20 15:10:40.087450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:26ffffff cdw11:fffffff6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.626 [2024-11-20 15:10:40.087478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.626 #15 NEW cov: 12454 ft: 13882 corp: 12/121b lim: 40 exec/s: 0 rss: 74Mb L: 11/12 MS: 1 ChangeBinInt- 00:09:01.626 [2024-11-20 15:10:40.137576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:26ffffff cdw11:ffff24ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.626 [2024-11-20 15:10:40.137604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.626 #16 NEW cov: 12454 ft: 13969 corp: 13/133b lim: 40 exec/s: 16 rss: 74Mb L: 12/12 MS: 1 ShuffleBytes- 00:09:01.626 [2024-11-20 15:10:40.207846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2626ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.626 [2024-11-20 15:10:40.207872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.626 #17 NEW cov: 12454 ft: 13994 corp: 14/147b lim: 40 exec/s: 17 rss: 74Mb L: 14/14 MS: 1 CrossOver- 00:09:01.626 [2024-11-20 15:10:40.258037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2626ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.626 [2024-11-20 15:10:40.258064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.626 #18 NEW cov: 12454 ft: 13998 corp: 15/161b lim: 40 exec/s: 18 rss: 74Mb L: 14/14 MS: 1 CopyPart- 00:09:01.885 [2024-11-20 15:10:40.328303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d6000000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.885 [2024-11-20 15:10:40.328335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.885 #19 NEW cov: 12454 ft: 14037 corp: 16/172b lim: 40 exec/s: 19 rss: 74Mb L: 11/14 MS: 1 ChangeBinInt- 00:09:01.885 [2024-11-20 15:10:40.378446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:26ffffff cdw11:ffff10ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.885 [2024-11-20 15:10:40.378474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.885 #20 NEW cov: 12454 ft: 14051 corp: 17/184b lim: 40 exec/s: 20 rss: 74Mb L: 12/14 MS: 1 ChangeByte- 00:09:01.885 [2024-11-20 15:10:40.428610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:26f7ffff cdw11:ffffff10 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.885 [2024-11-20 15:10:40.428639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.885 #21 NEW cov: 12454 ft: 14090 corp: 18/197b lim: 40 exec/s: 21 rss: 74Mb L: 13/14 MS: 1 InsertByte- 00:09:01.885 [2024-11-20 15:10:40.498967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 
nsid:0 cdw10:20000000 cdw11:00010000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.885 [2024-11-20 15:10:40.498995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.885 #22 NEW cov: 12454 ft: 14134 corp: 19/206b lim: 40 exec/s: 22 rss: 74Mb L: 9/14 MS: 1 ShuffleBytes- 00:09:01.885 [2024-11-20 15:10:40.569741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.886 [2024-11-20 15:10:40.569775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.886 [2024-11-20 15:10:40.569875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.886 [2024-11-20 15:10:40.569894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:01.886 [2024-11-20 15:10:40.569994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0100000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:01.886 [2024-11-20 15:10:40.570012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:02.144 #23 NEW cov: 12454 ft: 14459 corp: 20/230b lim: 40 exec/s: 23 rss: 74Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:09:02.144 [2024-11-20 15:10:40.640789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:36ffffff cdw11:18181818 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.144 [2024-11-20 15:10:40.640817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:02.144 [2024-11-20 15:10:40.640929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:18181818 cdw11:18181818 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.144 [2024-11-20 15:10:40.640946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:02.144 [2024-11-20 15:10:40.641039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:18181818 cdw11:18181818 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.144 [2024-11-20 15:10:40.641058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:02.144 [2024-11-20 15:10:40.641161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:18181818 cdw11:18181818 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.144 [2024-11-20 15:10:40.641178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:02.144 [2024-11-20 15:10:40.641278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:ffff10ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.144 [2024-11-20 15:10:40.641296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:02.144 #24 NEW cov: 12454 ft: 14981 corp: 21/270b lim: 40 exec/s: 
24 rss: 75Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:09:02.144 [2024-11-20 15:10:40.710665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:26260000 cdw11:00002000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.144 [2024-11-20 15:10:40.710692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:02.144 [2024-11-20 15:10:40.710783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.144 [2024-11-20 15:10:40.710800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:02.144 [2024-11-20 15:10:40.710903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff10 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.144 [2024-11-20 15:10:40.710921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:02.144 #25 NEW cov: 12454 ft: 15013 corp: 22/295b lim: 40 exec/s: 25 rss: 75Mb L: 25/40 MS: 1 CrossOver- 00:09:02.144 [2024-11-20 15:10:40.780703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2626ffff cdw11:ffffffef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.144 [2024-11-20 15:10:40.780733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:02.144 [2024-11-20 15:10:40.780827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:efefefef cdw11:efefefef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.144 [2024-11-20 15:10:40.780845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:02.144 [2024-11-20 15:10:40.780935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:efefffff cdw11:ffffff10 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.144 [2024-11-20 15:10:40.780951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:02.144 #26 NEW cov: 12454 ft: 15043 corp: 23/320b lim: 40 exec/s: 26 rss: 75Mb L: 25/40 MS: 1 InsertRepeatedBytes- 00:09:02.403 [2024-11-20 15:10:40.830990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.403 [2024-11-20 15:10:40.831017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:02.403 [2024-11-20 15:10:40.831112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.403 [2024-11-20 15:10:40.831128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:02.403 [2024-11-20 15:10:40.831217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0100000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.403 [2024-11-20 15:10:40.831232] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:02.403 #27 NEW cov: 12454 ft: 15063 corp: 24/344b lim: 40 exec/s: 27 rss: 75Mb L: 24/40 MS: 1 CopyPart- 00:09:02.403 [2024-11-20 15:10:40.900689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:23ffffff cdw11:ffff10ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.403 [2024-11-20 15:10:40.900715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:02.403 #28 NEW cov: 12454 ft: 15081 corp: 25/356b lim: 40 exec/s: 28 rss: 75Mb L: 12/40 MS: 1 ChangeBinInt- 00:09:02.403 [2024-11-20 15:10:40.971571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2626ffff cdw11:ffffffef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.403 [2024-11-20 15:10:40.971599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:02.403 [2024-11-20 15:10:40.971698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:efefefef cdw11:efefefef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.403 [2024-11-20 15:10:40.971716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:02.403 [2024-11-20 15:10:40.971812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:efefffff cdw11:bfffff10 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.403 [2024-11-20 15:10:40.971828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:02.403 #29 NEW cov: 12454 ft: 15084 corp: 26/381b lim: 40 exec/s: 29 rss: 75Mb L: 25/40 MS: 1 ChangeBit- 00:09:02.403 [2024-11-20 15:10:41.041723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2626ffff cdw11:ffffffef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.403 [2024-11-20 15:10:41.041752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:02.403 [2024-11-20 15:10:41.041844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:efefefef cdw11:efefefef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.403 [2024-11-20 15:10:41.041859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:02.403 [2024-11-20 15:10:41.041950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:efefffff cdw11:c0ffff10 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.403 [2024-11-20 15:10:41.041966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:02.403 #30 NEW cov: 12454 ft: 15098 corp: 27/406b lim: 40 exec/s: 30 rss: 75Mb L: 25/40 MS: 1 ChangeByte- 00:09:02.662 [2024-11-20 15:10:41.091299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:26f7ffff cdw11:10ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.662 [2024-11-20 15:10:41.091330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:09:02.662 #31 NEW cov: 12454 ft: 15119 corp: 28/416b lim: 40 exec/s: 31 rss: 75Mb L: 10/40 MS: 1 EraseBytes- 00:09:02.662 [2024-11-20 15:10:41.161513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:20000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.662 [2024-11-20 15:10:41.161539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:02.662 #32 pulse cov: 12454 ft: 15128 corp: 28/416b lim: 40 exec/s: 16 rss: 75Mb 00:09:02.662 #32 NEW cov: 12454 ft: 15128 corp: 29/425b lim: 40 exec/s: 16 rss: 75Mb L: 9/40 MS: 1 CopyPart- 00:09:02.662 #32 DONE cov: 12454 ft: 15128 corp: 29/425b lim: 40 exec/s: 16 rss: 75Mb 00:09:02.662 Done 32 runs in 2 second(s) 00:09:02.662 15:10:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_10.conf /var/tmp/suppress_nvmf_fuzz 00:09:02.662 15:10:41 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:02.662 15:10:41 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:02.662 15:10:41 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:09:02.662 15:10:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:09:02.662 15:10:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:09:02.662 15:10:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:09:02.662 15:10:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:09:02.662 15:10:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:09:02.662 15:10:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:02.662 15:10:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:02.662 15:10:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 11 00:09:02.662 15:10:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4411 00:09:02.662 15:10:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:09:02.662 15:10:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:09:02.662 15:10:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:02.662 15:10:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:02.662 15:10:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:02.662 15:10:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 00:09:02.662 [2024-11-20 15:10:41.325980] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
00:09:02.662 [2024-11-20 15:10:41.326061] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1477122 ] 00:09:02.920 [2024-11-20 15:10:41.546666] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:02.920 [2024-11-20 15:10:41.561163] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:03.177 [2024-11-20 15:10:41.613931] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:03.177 [2024-11-20 15:10:41.630123] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:09:03.177 INFO: Running with entropic power schedule (0xFF, 100). 00:09:03.177 INFO: Seed: 4170087650 00:09:03.177 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:09:03.177 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:09:03.177 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:09:03.177 INFO: A corpus is not provided, starting from an empty corpus 00:09:03.177 #2 INITED exec/s: 0 rss: 66Mb 00:09:03.177 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:03.177 This may also happen if the target rejected all inputs we tried so far 00:09:03.177 [2024-11-20 15:10:41.685829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.177 [2024-11-20 15:10:41.685859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.177 [2024-11-20 15:10:41.685922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff79 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.177 [2024-11-20 15:10:41.685936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.435 NEW_FUNC[1/716]: 0x468278 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:09:03.436 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:03.436 #4 NEW cov: 12239 ft: 12237 corp: 2/17b lim: 40 exec/s: 0 rss: 73Mb L: 16/16 MS: 2 ChangeByte-InsertRepeatedBytes- 00:09:03.436 [2024-11-20 15:10:42.026643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.436 [2024-11-20 15:10:42.026680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.436 #5 NEW cov: 12353 ft: 13531 corp: 3/29b lim: 40 exec/s: 0 rss: 73Mb L: 12/16 MS: 1 EraseBytes- 00:09:03.436 [2024-11-20 15:10:42.086855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.436 [2024-11-20 15:10:42.086884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.436 [2024-11-20 
15:10:42.086947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff79 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.436 [2024-11-20 15:10:42.086964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.436 #6 NEW cov: 12359 ft: 13907 corp: 4/45b lim: 40 exec/s: 0 rss: 73Mb L: 16/16 MS: 1 ShuffleBytes- 00:09:03.694 [2024-11-20 15:10:42.126894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.694 [2024-11-20 15:10:42.126925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.694 [2024-11-20 15:10:42.126989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:26ffffff cdw11:ffffff79 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.694 [2024-11-20 15:10:42.127004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.694 #7 NEW cov: 12444 ft: 14182 corp: 5/61b lim: 40 exec/s: 0 rss: 74Mb L: 16/16 MS: 1 ChangeByte- 00:09:03.694 [2024-11-20 15:10:42.187128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.694 [2024-11-20 15:10:42.187153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.694 [2024-11-20 15:10:42.187214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:0affff79 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.694 [2024-11-20 15:10:42.187229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.694 #8 NEW cov: 12444 ft: 14332 corp: 6/77b lim: 40 exec/s: 0 rss: 74Mb L: 16/16 MS: 1 CrossOver- 00:09:03.694 [2024-11-20 15:10:42.227351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff26 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.694 [2024-11-20 15:10:42.227377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.694 [2024-11-20 15:10:42.227438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff79ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.694 [2024-11-20 15:10:42.227452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.694 [2024-11-20 15:10:42.227512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff26ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.694 [2024-11-20 15:10:42.227527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.694 #9 NEW cov: 12444 ft: 14619 corp: 7/107b lim: 40 exec/s: 0 rss: 74Mb L: 30/30 MS: 1 CopyPart- 00:09:03.694 [2024-11-20 15:10:42.287407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:09:03.694 [2024-11-20 15:10:42.287433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.694 [2024-11-20 15:10:42.287492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff79 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.694 [2024-11-20 15:10:42.287506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.694 #10 NEW cov: 12444 ft: 14790 corp: 8/123b lim: 40 exec/s: 0 rss: 74Mb L: 16/30 MS: 1 ShuffleBytes- 00:09:03.694 [2024-11-20 15:10:42.327324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.694 [2024-11-20 15:10:42.327351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.694 #11 NEW cov: 12444 ft: 14883 corp: 9/136b lim: 40 exec/s: 0 rss: 74Mb L: 13/30 MS: 1 EraseBytes- 00:09:03.952 [2024-11-20 15:10:42.388018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.952 [2024-11-20 15:10:42.388044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.952 [2024-11-20 15:10:42.388107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.952 [2024-11-20 15:10:42.388122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.952 [2024-11-20 15:10:42.388179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.952 [2024-11-20 15:10:42.388193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.952 [2024-11-20 15:10:42.388249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:0c0c0cff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.952 [2024-11-20 15:10:42.388263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:03.952 #12 NEW cov: 12444 ft: 15277 corp: 10/172b lim: 40 exec/s: 0 rss: 74Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:09:03.952 [2024-11-20 15:10:42.448011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.952 [2024-11-20 15:10:42.448037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.952 [2024-11-20 15:10:42.448097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.952 [2024-11-20 15:10:42.448111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.952 [2024-11-20 15:10:42.448167] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.952 [2024-11-20 15:10:42.448181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.952 #13 NEW cov: 12444 ft: 15318 corp: 11/200b lim: 40 exec/s: 0 rss: 74Mb L: 28/36 MS: 1 InsertRepeatedBytes- 00:09:03.952 [2024-11-20 15:10:42.487952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.952 [2024-11-20 15:10:42.487978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.952 [2024-11-20 15:10:42.488036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:0affff79 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.952 [2024-11-20 15:10:42.488050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.952 #14 NEW cov: 12444 ft: 15329 corp: 12/216b lim: 40 exec/s: 0 rss: 74Mb L: 16/36 MS: 1 CrossOver- 00:09:03.952 [2024-11-20 15:10:42.527871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.952 [2024-11-20 15:10:42.527897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.952 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:03.952 #15 NEW cov: 12467 ft: 15379 corp: 13/228b lim: 40 exec/s: 0 rss: 74Mb L: 12/36 MS: 1 EraseBytes- 00:09:03.952 [2024-11-20 15:10:42.588568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.952 [2024-11-20 15:10:42.588594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.952 [2024-11-20 15:10:42.588658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0c2a0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.952 [2024-11-20 15:10:42.588672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.952 [2024-11-20 15:10:42.588732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.952 [2024-11-20 15:10:42.588747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.952 [2024-11-20 15:10:42.588806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:0c0c0c0c cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.952 [2024-11-20 15:10:42.588820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:03.952 #16 NEW cov: 12467 ft: 15434 corp: 14/265b lim: 40 exec/s: 0 rss: 74Mb L: 37/37 MS: 1 InsertByte- 00:09:04.210 [2024-11-20 15:10:42.648249] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0aff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.210 [2024-11-20 15:10:42.648275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.210 #17 NEW cov: 12467 ft: 15480 corp: 15/275b lim: 40 exec/s: 17 rss: 74Mb L: 10/37 MS: 1 EraseBytes- 00:09:04.210 [2024-11-20 15:10:42.708563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:aeffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.210 [2024-11-20 15:10:42.708589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.210 [2024-11-20 15:10:42.708650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff79 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.210 [2024-11-20 15:10:42.708665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.210 #18 NEW cov: 12467 ft: 15509 corp: 16/291b lim: 40 exec/s: 18 rss: 74Mb L: 16/37 MS: 1 ChangeByte- 00:09:04.210 [2024-11-20 15:10:42.748646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.210 [2024-11-20 15:10:42.748671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.210 [2024-11-20 15:10:42.748730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff79 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.210 [2024-11-20 15:10:42.748744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.210 #19 NEW cov: 12467 ft: 15516 corp: 17/307b lim: 40 exec/s: 19 rss: 74Mb L: 16/37 MS: 1 CrossOver- 00:09:04.210 [2024-11-20 15:10:42.789081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.210 [2024-11-20 15:10:42.789106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.210 [2024-11-20 15:10:42.789168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.210 [2024-11-20 15:10:42.789183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.210 [2024-11-20 15:10:42.789244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.210 [2024-11-20 15:10:42.789258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:04.210 [2024-11-20 15:10:42.789324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:0c0c0cff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.210 [2024-11-20 15:10:42.789338] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:04.210 #20 NEW cov: 12467 ft: 15555 corp: 18/343b lim: 40 exec/s: 20 rss: 74Mb L: 36/37 MS: 1 CopyPart- 00:09:04.210 [2024-11-20 15:10:42.828869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.210 [2024-11-20 15:10:42.828895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.210 [2024-11-20 15:10:42.828957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.210 [2024-11-20 15:10:42.828971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.211 #21 NEW cov: 12467 ft: 15615 corp: 19/364b lim: 40 exec/s: 21 rss: 74Mb L: 21/37 MS: 1 CrossOver- 00:09:04.211 [2024-11-20 15:10:42.889111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.211 [2024-11-20 15:10:42.889136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.211 [2024-11-20 15:10:42.889197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.211 [2024-11-20 15:10:42.889211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.469 #22 NEW cov: 12467 ft: 15645 corp: 20/383b lim: 40 exec/s: 22 rss: 74Mb L: 19/37 MS: 1 CrossOver- 00:09:04.469 [2024-11-20 15:10:42.949414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff26 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.469 [2024-11-20 15:10:42.949439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.469 [2024-11-20 15:10:42.949501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff79ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.469 [2024-11-20 15:10:42.949516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.469 [2024-11-20 15:10:42.949574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff26ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.469 [2024-11-20 15:10:42.949588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:04.469 #23 NEW cov: 12467 ft: 15657 corp: 21/413b lim: 40 exec/s: 23 rss: 74Mb L: 30/37 MS: 1 CrossOver- 00:09:04.469 [2024-11-20 15:10:43.009396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.469 [2024-11-20 15:10:43.009421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.469 [2024-11-20 
15:10:43.009485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0045249a cdw11:ad8cb8bc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.469 [2024-11-20 15:10:43.009502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.469 #24 NEW cov: 12467 ft: 15693 corp: 22/429b lim: 40 exec/s: 24 rss: 74Mb L: 16/37 MS: 1 CMP- DE: "\000E$\232\255\214\270\274"- 00:09:04.469 [2024-11-20 15:10:43.049336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.469 [2024-11-20 15:10:43.049361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.469 #25 NEW cov: 12467 ft: 15719 corp: 23/442b lim: 40 exec/s: 25 rss: 74Mb L: 13/37 MS: 1 ChangeBit- 00:09:04.469 [2024-11-20 15:10:43.089646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.469 [2024-11-20 15:10:43.089672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.469 [2024-11-20 15:10:43.089734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff2e79 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.469 [2024-11-20 15:10:43.089749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.469 #26 NEW cov: 12467 ft: 15739 corp: 24/458b lim: 40 exec/s: 26 rss: 74Mb L: 16/37 MS: 1 ChangeByte- 00:09:04.469 [2024-11-20 15:10:43.129756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.469 [2024-11-20 15:10:43.129782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.469 [2024-11-20 15:10:43.129848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffd7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.469 [2024-11-20 15:10:43.129863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.469 #27 NEW cov: 12467 ft: 15750 corp: 25/475b lim: 40 exec/s: 27 rss: 75Mb L: 17/37 MS: 1 InsertByte- 00:09:04.728 [2024-11-20 15:10:43.169819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0aff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.728 [2024-11-20 15:10:43.169845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.728 [2024-11-20 15:10:43.169908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ff79ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.728 [2024-11-20 15:10:43.169922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.728 #28 NEW cov: 12467 ft: 15757 corp: 26/492b lim: 40 exec/s: 28 rss: 75Mb L: 17/37 MS: 1 CrossOver- 00:09:04.728 [2024-11-20 
15:10:43.209952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.728 [2024-11-20 15:10:43.209977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.728 [2024-11-20 15:10:43.210037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.728 [2024-11-20 15:10:43.210051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.728 #29 NEW cov: 12467 ft: 15819 corp: 27/509b lim: 40 exec/s: 29 rss: 75Mb L: 17/37 MS: 1 CopyPart- 00:09:04.728 [2024-11-20 15:10:43.270477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.728 [2024-11-20 15:10:43.270504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.728 [2024-11-20 15:10:43.270566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.728 [2024-11-20 15:10:43.270588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.728 [2024-11-20 15:10:43.270645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.728 [2024-11-20 15:10:43.270659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:04.728 [2024-11-20 15:10:43.270721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:0c0c0c0c cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.728 [2024-11-20 15:10:43.270735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:04.728 #30 NEW cov: 12467 ft: 15827 corp: 28/545b lim: 40 exec/s: 30 rss: 75Mb L: 36/37 MS: 1 CopyPart- 00:09:04.728 [2024-11-20 15:10:43.330479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffff7f cdw11:7f7f7f7f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.728 [2024-11-20 15:10:43.330505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.728 [2024-11-20 15:10:43.330567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7fffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.728 [2024-11-20 15:10:43.330581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.728 [2024-11-20 15:10:43.330639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.728 [2024-11-20 15:10:43.330652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:04.728 #31 NEW 
cov: 12467 ft: 15852 corp: 29/575b lim: 40 exec/s: 31 rss: 75Mb L: 30/37 MS: 1 InsertRepeatedBytes- 00:09:04.728 [2024-11-20 15:10:43.390666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.728 [2024-11-20 15:10:43.390692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.728 [2024-11-20 15:10:43.390752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.728 [2024-11-20 15:10:43.390770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.728 [2024-11-20 15:10:43.390829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.728 [2024-11-20 15:10:43.390843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:04.986 #32 NEW cov: 12467 ft: 15878 corp: 30/603b lim: 40 exec/s: 32 rss: 75Mb L: 28/37 MS: 1 CMP- DE: "\000\000"- 00:09:04.986 [2024-11-20 15:10:43.451001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.986 [2024-11-20 15:10:43.451026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.986 [2024-11-20 15:10:43.451089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.986 [2024-11-20 15:10:43.451103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.986 [2024-11-20 15:10:43.451166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.986 [2024-11-20 15:10:43.451183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:04.986 [2024-11-20 15:10:43.451245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:0c0c0cff cdw11:fffffbff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.986 [2024-11-20 15:10:43.451259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:04.986 #33 NEW cov: 12467 ft: 15893 corp: 31/639b lim: 40 exec/s: 33 rss: 75Mb L: 36/37 MS: 1 ChangeBinInt- 00:09:04.986 [2024-11-20 15:10:43.491137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.986 [2024-11-20 15:10:43.491163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.986 [2024-11-20 15:10:43.491226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0c2a0c0c cdw11:0c0c0c3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.986 [2024-11-20 15:10:43.491240] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.986 [2024-11-20 15:10:43.491299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.986 [2024-11-20 15:10:43.491318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:04.986 [2024-11-20 15:10:43.491378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:0c0c0c0c cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.986 [2024-11-20 15:10:43.491392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:04.986 #34 NEW cov: 12467 ft: 15920 corp: 32/676b lim: 40 exec/s: 34 rss: 75Mb L: 37/37 MS: 1 ChangeByte- 00:09:04.986 [2024-11-20 15:10:43.550987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.986 [2024-11-20 15:10:43.551012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.986 [2024-11-20 15:10:43.551075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:26ffffff cdw11:ffffff79 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.986 [2024-11-20 15:10:43.551089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.986 #35 NEW cov: 12467 ft: 15982 corp: 33/692b lim: 40 exec/s: 35 rss: 75Mb L: 16/37 MS: 1 CopyPart- 00:09:04.986 [2024-11-20 15:10:43.591219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff410000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.986 [2024-11-20 15:10:43.591244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.986 [2024-11-20 15:10:43.591307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.986 [2024-11-20 15:10:43.591332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.986 [2024-11-20 15:10:43.591395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.986 [2024-11-20 15:10:43.591409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:04.986 #36 NEW cov: 12467 ft: 15999 corp: 34/721b lim: 40 exec/s: 36 rss: 75Mb L: 29/37 MS: 1 InsertByte- 00:09:04.986 [2024-11-20 15:10:43.651210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.986 [2024-11-20 15:10:43.651239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.986 [2024-11-20 15:10:43.651302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 
cid:5 nsid:0 cdw10:ffffffff cdw11:fff72e79 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.986 [2024-11-20 15:10:43.651321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:05.243 #37 NEW cov: 12467 ft: 16039 corp: 35/737b lim: 40 exec/s: 18 rss: 75Mb L: 16/37 MS: 1 ChangeBit- 00:09:05.243 #37 DONE cov: 12467 ft: 16039 corp: 35/737b lim: 40 exec/s: 18 rss: 75Mb 00:09:05.243 ###### Recommended dictionary. ###### 00:09:05.243 "\000E$\232\255\214\270\274" # Uses: 0 00:09:05.243 "\000\000" # Uses: 0 00:09:05.243 ###### End of recommended dictionary. ###### 00:09:05.243 Done 37 runs in 2 second(s) 00:09:05.243 15:10:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_11.conf /var/tmp/suppress_nvmf_fuzz 00:09:05.243 15:10:43 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:05.243 15:10:43 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:05.243 15:10:43 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:09:05.243 15:10:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:09:05.243 15:10:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:09:05.243 15:10:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:09:05.243 15:10:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:09:05.243 15:10:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:09:05.243 15:10:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:05.243 15:10:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:05.243 15:10:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 12 00:09:05.243 15:10:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4412 00:09:05.243 15:10:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:09:05.243 15:10:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:09:05.243 15:10:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:05.243 15:10:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:05.243 15:10:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:05.243 15:10:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 00:09:05.243 [2024-11-20 15:10:43.832872] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
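The xtrace block above shows how ../common.sh and nvmf/run.sh wire up each fuzzing round: start_llvm_fuzz 12 1 0x1 derives the NVMe/TCP listen port by appending the zero-padded round number to 44 (printf %02d 12 -> port=4412), rewrites the default trsvcid 4420 in fuzz_json.conf to that port with sed, points LSAN_OPTIONS at suppressions for two known shutdown-path leaks, and then launches llvm_nvme_fuzz against the per-round corpus directory with -Z selecting fuzzer 12. In the libFuzzer-style status lines around it, "cov" counts covered code points, "ft" counts features, "corp" gives corpus entries and total bytes, "L" gives this input's length versus the corpus maximum, and "MS" names the mutation(s) that produced the input (ShuffleBytes, CrossOver, PersAutoDict drawing on the recommended dictionary, and so on). A minimal shell sketch of the traced round setup, reconstructed only from the xtrace lines above — the function name, variable names, and values are read off the trace, while $rootdir, the redirections, and everything else about the real nvmf/run.sh are assumptions:

    # Sketch of one start_llvm_fuzz round, reconstructed from the xtrace above.
    start_llvm_fuzz() {
        local fuzzer_type=$1 timen=$2 core=$3
        local corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
        local nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
        local suppress_file=/var/tmp/suppress_nvmf_fuzz
        local LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0
        # Listen port is 44 plus the zero-padded round number: round 12 -> 4412.
        local port=44$(printf %02d "$fuzzer_type")
        mkdir -p "$corpus_dir"
        local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
        # Give this round its own target config by swapping the default listener port.
        sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
            "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
        # Known-leaky shutdown paths are suppressed rather than failing the run.
        echo leak:spdk_nvmf_qpair_disconnect > "$suppress_file"
        echo leak:nvmf_ctrlr_create >> "$suppress_file"
        # -m is the reactor core mask (it shows up as EAL -c 0x1 below); -s is memory in MB.
        "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
            -P "$rootdir/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t "$timen" \
            -D "$corpus_dir" -Z "$fuzzer_type"
    }

With that wiring in mind, the records that follow read as one self-contained round: the target comes up on 127.0.0.1:4412, libFuzzer starts from an empty corpus (the WARNING about no interesting inputs is expected on the first iterations), and each DIRECTIVE SEND print paired with an INVALID OPCODE completion is one fuzzed admin command being rejected by the target.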
00:09:05.243 [2024-11-20 15:10:43.832944] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1477483 ] 00:09:05.501 [2024-11-20 15:10:44.046517] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:05.501 [2024-11-20 15:10:44.060984] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:05.501 [2024-11-20 15:10:44.113763] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:05.501 [2024-11-20 15:10:44.130031] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:09:05.501 INFO: Running with entropic power schedule (0xFF, 100). 00:09:05.501 INFO: Seed: 2374121257 00:09:05.501 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:09:05.501 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:09:05.501 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:09:05.501 INFO: A corpus is not provided, starting from an empty corpus 00:09:05.501 #2 INITED exec/s: 0 rss: 66Mb 00:09:05.501 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:05.501 This may also happen if the target rejected all inputs we tried so far 00:09:05.501 [2024-11-20 15:10:44.178990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8a004524 cdw11:9b4db47f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.501 [2024-11-20 15:10:44.179020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.016 NEW_FUNC[1/715]: 0x469fe8 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:09:06.016 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:06.016 #4 NEW cov: 12229 ft: 12224 corp: 2/10b lim: 40 exec/s: 0 rss: 73Mb L: 9/9 MS: 2 ChangeBit-CMP- DE: "\000E$\233M\264\177("- 00:09:06.016 [2024-11-20 15:10:44.519915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.016 [2024-11-20 15:10:44.519951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.016 [2024-11-20 15:10:44.520007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.016 [2024-11-20 15:10:44.520021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.016 NEW_FUNC[1/1]: 0x1ff2188 in msg_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:833 00:09:06.016 #5 NEW cov: 12351 ft: 13477 corp: 3/27b lim: 40 exec/s: 0 rss: 73Mb L: 17/17 MS: 1 InsertRepeatedBytes- 00:09:06.016 [2024-11-20 15:10:44.569930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ea0a0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.016 [2024-11-20 15:10:44.569958] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.016 [2024-11-20 15:10:44.570015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.016 [2024-11-20 15:10:44.570029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.016 #9 NEW cov: 12357 ft: 13751 corp: 4/43b lim: 40 exec/s: 0 rss: 73Mb L: 16/17 MS: 4 ChangeBit-ChangeBit-InsertByte-InsertRepeatedBytes- 00:09:06.016 [2024-11-20 15:10:44.610052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.016 [2024-11-20 15:10:44.610078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.016 [2024-11-20 15:10:44.610133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00008a00 cdw11:45249b4d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.016 [2024-11-20 15:10:44.610147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.016 #10 NEW cov: 12442 ft: 14020 corp: 5/62b lim: 40 exec/s: 0 rss: 74Mb L: 19/19 MS: 1 CrossOver- 00:09:06.016 [2024-11-20 15:10:44.670046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a004524 cdw11:9b4db47f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.016 [2024-11-20 15:10:44.670076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.016 #11 NEW cov: 12442 ft: 14199 corp: 6/71b lim: 40 exec/s: 0 rss: 74Mb L: 9/19 MS: 1 PersAutoDict- DE: "\000E$\233M\264\177("- 00:09:06.275 [2024-11-20 15:10:44.710290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.275 [2024-11-20 15:10:44.710321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.275 [2024-11-20 15:10:44.710378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.275 [2024-11-20 15:10:44.710393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.275 #12 NEW cov: 12442 ft: 14274 corp: 7/88b lim: 40 exec/s: 0 rss: 74Mb L: 17/19 MS: 1 ShuffleBytes- 00:09:06.275 [2024-11-20 15:10:44.770345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a004524 cdw11:9b4fb47f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.275 [2024-11-20 15:10:44.770372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.275 #13 NEW cov: 12442 ft: 14387 corp: 8/97b lim: 40 exec/s: 0 rss: 74Mb L: 9/19 MS: 1 ChangeBit- 00:09:06.275 [2024-11-20 15:10:44.830673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000045 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:09:06.275 [2024-11-20 15:10:44.830698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.275 [2024-11-20 15:10:44.830757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:249b0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.275 [2024-11-20 15:10:44.830771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.275 #14 NEW cov: 12442 ft: 14403 corp: 9/114b lim: 40 exec/s: 0 rss: 74Mb L: 17/19 MS: 1 CrossOver- 00:09:06.275 [2024-11-20 15:10:44.870871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a004524 cdw11:9b4fb47f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.275 [2024-11-20 15:10:44.870897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.275 [2024-11-20 15:10:44.870969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0a000000 cdw11:00280000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.275 [2024-11-20 15:10:44.870984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.275 [2024-11-20 15:10:44.871042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:45249b00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.275 [2024-11-20 15:10:44.871055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:06.275 #15 NEW cov: 12442 ft: 14646 corp: 10/140b lim: 40 exec/s: 0 rss: 74Mb L: 26/26 MS: 1 CrossOver- 00:09:06.275 [2024-11-20 15:10:44.930772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a004524 cdw11:9b4fb47f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.275 [2024-11-20 15:10:44.930797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.275 #16 NEW cov: 12442 ft: 14692 corp: 11/149b lim: 40 exec/s: 0 rss: 74Mb L: 9/26 MS: 1 ChangeBit- 00:09:06.533 [2024-11-20 15:10:44.971041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.533 [2024-11-20 15:10:44.971069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.533 [2024-11-20 15:10:44.971128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:45249b00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.534 [2024-11-20 15:10:44.971143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.534 #17 NEW cov: 12442 ft: 14801 corp: 12/167b lim: 40 exec/s: 0 rss: 74Mb L: 18/26 MS: 1 InsertByte- 00:09:06.534 [2024-11-20 15:10:45.031053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0045249b cdw11:4db47f28 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.534 [2024-11-20 15:10:45.031078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:09:06.534 #18 NEW cov: 12442 ft: 14822 corp: 13/176b lim: 40 exec/s: 0 rss: 74Mb L: 9/26 MS: 1 PersAutoDict- DE: "\000E$\233M\264\177("- 00:09:06.534 [2024-11-20 15:10:45.071642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.534 [2024-11-20 15:10:45.071666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.534 [2024-11-20 15:10:45.071740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.534 [2024-11-20 15:10:45.071754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.534 [2024-11-20 15:10:45.071809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.534 [2024-11-20 15:10:45.071823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:06.534 [2024-11-20 15:10:45.071876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.534 [2024-11-20 15:10:45.071890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:06.534 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:06.534 #19 NEW cov: 12465 ft: 15162 corp: 14/214b lim: 40 exec/s: 0 rss: 74Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:09:06.534 [2024-11-20 15:10:45.111473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:ac000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.534 [2024-11-20 15:10:45.111498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.534 [2024-11-20 15:10:45.111573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.534 [2024-11-20 15:10:45.111588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.534 #20 NEW cov: 12465 ft: 15189 corp: 15/231b lim: 40 exec/s: 0 rss: 74Mb L: 17/38 MS: 1 ChangeByte- 00:09:06.534 [2024-11-20 15:10:45.171626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.534 [2024-11-20 15:10:45.171651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.534 [2024-11-20 15:10:45.171709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.534 [2024-11-20 15:10:45.171723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.534 #21 NEW cov: 12465 ft: 15234 corp: 16/248b 
lim: 40 exec/s: 21 rss: 74Mb L: 17/38 MS: 1 ShuffleBytes- 00:09:06.534 [2024-11-20 15:10:45.211709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a004524 cdw11:9b002800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.534 [2024-11-20 15:10:45.211734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.534 [2024-11-20 15:10:45.211789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0045249b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.534 [2024-11-20 15:10:45.211803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.793 #22 NEW cov: 12465 ft: 15258 corp: 17/267b lim: 40 exec/s: 22 rss: 74Mb L: 19/38 MS: 1 EraseBytes- 00:09:06.793 [2024-11-20 15:10:45.272030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.793 [2024-11-20 15:10:45.272055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.793 [2024-11-20 15:10:45.272112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00008a00 cdw11:45249b4d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.793 [2024-11-20 15:10:45.272125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.793 [2024-11-20 15:10:45.272180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0045249b cdw11:4db47f28 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.793 [2024-11-20 15:10:45.272193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:06.793 #23 NEW cov: 12465 ft: 15276 corp: 18/294b lim: 40 exec/s: 23 rss: 74Mb L: 27/38 MS: 1 PersAutoDict- DE: "\000E$\233M\264\177("- 00:09:06.793 [2024-11-20 15:10:45.332048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.793 [2024-11-20 15:10:45.332073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.793 [2024-11-20 15:10:45.332129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.793 [2024-11-20 15:10:45.332143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.793 #24 NEW cov: 12465 ft: 15287 corp: 19/311b lim: 40 exec/s: 24 rss: 74Mb L: 17/38 MS: 1 CrossOver- 00:09:06.793 [2024-11-20 15:10:45.372451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.793 [2024-11-20 15:10:45.372475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.793 [2024-11-20 15:10:45.372551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.793 [2024-11-20 15:10:45.372565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.793 [2024-11-20 15:10:45.372620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.793 [2024-11-20 15:10:45.372633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:06.793 [2024-11-20 15:10:45.372687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.793 [2024-11-20 15:10:45.372704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:06.793 #27 NEW cov: 12465 ft: 15319 corp: 20/347b lim: 40 exec/s: 27 rss: 74Mb L: 36/38 MS: 3 ChangeByte-ChangeBit-InsertRepeatedBytes- 00:09:06.793 [2024-11-20 15:10:45.412449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a004524 cdw11:9b4fb47f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.793 [2024-11-20 15:10:45.412474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.793 [2024-11-20 15:10:45.412532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0a000000 cdw11:00280000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.793 [2024-11-20 15:10:45.412546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.793 [2024-11-20 15:10:45.412599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:45249b00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.793 [2024-11-20 15:10:45.412612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:06.793 #28 NEW cov: 12465 ft: 15344 corp: 21/373b lim: 40 exec/s: 28 rss: 74Mb L: 26/38 MS: 1 ShuffleBytes- 00:09:06.793 [2024-11-20 15:10:45.452431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ea0a0000 cdw11:0000004d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.793 [2024-11-20 15:10:45.452457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.793 [2024-11-20 15:10:45.452516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.793 [2024-11-20 15:10:45.452530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.051 #29 NEW cov: 12465 ft: 15364 corp: 22/389b lim: 40 exec/s: 29 rss: 74Mb L: 16/38 MS: 1 ChangeByte- 00:09:07.051 [2024-11-20 15:10:45.512609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.051 [2024-11-20 15:10:45.512634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:09:07.051 [2024-11-20 15:10:45.512691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00008a00 cdw11:45249b4d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.051 [2024-11-20 15:10:45.512705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.051 #35 NEW cov: 12465 ft: 15373 corp: 23/408b lim: 40 exec/s: 35 rss: 74Mb L: 19/38 MS: 1 ShuffleBytes- 00:09:07.051 [2024-11-20 15:10:45.552989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a004524 cdw11:9b4fb47f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.051 [2024-11-20 15:10:45.553014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.051 [2024-11-20 15:10:45.553073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0a000000 cdw11:00280000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.051 [2024-11-20 15:10:45.553086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.051 [2024-11-20 15:10:45.553139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:45240000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.051 [2024-11-20 15:10:45.553153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.051 [2024-11-20 15:10:45.553209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:009b0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.051 [2024-11-20 15:10:45.553222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:07.051 #36 NEW cov: 12465 ft: 15402 corp: 24/441b lim: 40 exec/s: 36 rss: 74Mb L: 33/38 MS: 1 CrossOver- 00:09:07.051 [2024-11-20 15:10:45.592935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.051 [2024-11-20 15:10:45.592960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.051 [2024-11-20 15:10:45.593018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:45249b00 cdw11:00004524 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.051 [2024-11-20 15:10:45.593031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.051 [2024-11-20 15:10:45.593099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:9b4db47f cdw11:28000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.051 [2024-11-20 15:10:45.593113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.051 #37 NEW cov: 12465 ft: 15423 corp: 25/467b lim: 40 exec/s: 37 rss: 74Mb L: 26/38 MS: 1 PersAutoDict- DE: "\000E$\233M\264\177("- 00:09:07.051 [2024-11-20 15:10:45.652994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a004524 cdw11:9b002e00 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:09:07.051 [2024-11-20 15:10:45.653019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.051 [2024-11-20 15:10:45.653072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0045249b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.051 [2024-11-20 15:10:45.653086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.051 #38 NEW cov: 12465 ft: 15543 corp: 26/486b lim: 40 exec/s: 38 rss: 75Mb L: 19/38 MS: 1 ChangeBinInt- 00:09:07.051 [2024-11-20 15:10:45.712983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0045249b cdw11:4d249b4d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.051 [2024-11-20 15:10:45.713008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.308 #39 NEW cov: 12465 ft: 15559 corp: 27/495b lim: 40 exec/s: 39 rss: 75Mb L: 9/38 MS: 1 CopyPart- 00:09:07.308 [2024-11-20 15:10:45.773269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:0040003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.308 [2024-11-20 15:10:45.773294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.308 [2024-11-20 15:10:45.773368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:45249b00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.308 [2024-11-20 15:10:45.773383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.308 #40 NEW cov: 12465 ft: 15592 corp: 28/513b lim: 40 exec/s: 40 rss: 75Mb L: 18/38 MS: 1 ChangeBit- 00:09:07.308 [2024-11-20 15:10:45.813771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a004524 cdw11:9b4fb47f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.308 [2024-11-20 15:10:45.813797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.308 [2024-11-20 15:10:45.813857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0045249b cdw11:4db47f28 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.308 [2024-11-20 15:10:45.813872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.308 [2024-11-20 15:10:45.813927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0a000000 cdw11:00280000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.308 [2024-11-20 15:10:45.813941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.308 [2024-11-20 15:10:45.813993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:45249b00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.308 [2024-11-20 15:10:45.814007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:07.308 #41 NEW cov: 12465 ft: 15608 corp: 29/547b lim: 40 
exec/s: 41 rss: 75Mb L: 34/38 MS: 1 PersAutoDict- DE: "\000E$\233M\264\177("- 00:09:07.308 [2024-11-20 15:10:45.853392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:06004524 cdw11:9b4db47f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.308 [2024-11-20 15:10:45.853419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.308 #45 NEW cov: 12465 ft: 15729 corp: 30/556b lim: 40 exec/s: 45 rss: 75Mb L: 9/38 MS: 4 CopyPart-ChangeBit-ChangeBit-PersAutoDict- DE: "\000E$\233M\264\177("- 00:09:07.308 [2024-11-20 15:10:45.893827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a454fb4 cdw11:9b24007f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.308 [2024-11-20 15:10:45.893853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.308 [2024-11-20 15:10:45.893925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0a000000 cdw11:00280000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.308 [2024-11-20 15:10:45.893939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.308 [2024-11-20 15:10:45.893993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:45249b00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.308 [2024-11-20 15:10:45.894006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.308 #46 NEW cov: 12465 ft: 15734 corp: 31/582b lim: 40 exec/s: 46 rss: 75Mb L: 26/38 MS: 1 ShuffleBytes- 00:09:07.308 [2024-11-20 15:10:45.954152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.308 [2024-11-20 15:10:45.954178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.308 [2024-11-20 15:10:45.954233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.308 [2024-11-20 15:10:45.954247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.308 [2024-11-20 15:10:45.954301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.308 [2024-11-20 15:10:45.954320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.308 [2024-11-20 15:10:45.954373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.308 [2024-11-20 15:10:45.954386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:07.566 #47 NEW cov: 12465 ft: 15749 corp: 32/620b lim: 40 exec/s: 47 rss: 75Mb L: 38/38 MS: 1 ShuffleBytes- 00:09:07.566 [2024-11-20 15:10:46.014139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a454fb4 cdw11:9b24007f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.567 [2024-11-20 15:10:46.014165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.567 [2024-11-20 15:10:46.014220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0a000000 cdw11:00200000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.567 [2024-11-20 15:10:46.014233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.567 [2024-11-20 15:10:46.014284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:45249b00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.567 [2024-11-20 15:10:46.014297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.567 #48 NEW cov: 12465 ft: 15761 corp: 33/646b lim: 40 exec/s: 48 rss: 75Mb L: 26/38 MS: 1 ChangeBinInt- 00:09:07.567 [2024-11-20 15:10:46.074126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.567 [2024-11-20 15:10:46.074152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.567 [2024-11-20 15:10:46.074209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00008a00 cdw11:9b4d24b4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.567 [2024-11-20 15:10:46.074223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.567 #49 NEW cov: 12465 ft: 15780 corp: 34/665b lim: 40 exec/s: 49 rss: 75Mb L: 19/38 MS: 1 ShuffleBytes- 00:09:07.567 [2024-11-20 15:10:46.114242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a004524 cdw11:9b4fb400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.567 [2024-11-20 15:10:46.114268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.567 [2024-11-20 15:10:46.114347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:45249b4d cdw11:b47f287f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.567 [2024-11-20 15:10:46.114363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.567 #50 NEW cov: 12465 ft: 15789 corp: 35/682b lim: 40 exec/s: 50 rss: 75Mb L: 17/38 MS: 1 PersAutoDict- DE: "\000E$\233M\264\177("- 00:09:07.567 [2024-11-20 15:10:46.174695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.567 [2024-11-20 15:10:46.174721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.567 [2024-11-20 15:10:46.174771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:45249b00 cdw11:00004524 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.567 [2024-11-20 15:10:46.174785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.567 [2024-11-20 15:10:46.174836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:9b4db47f cdw11:28000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.567 [2024-11-20 15:10:46.174850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.567 [2024-11-20 15:10:46.174901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0000d2d2 cdw11:d2d2d2d2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.567 [2024-11-20 15:10:46.174917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:07.567 #51 NEW cov: 12465 ft: 15812 corp: 36/714b lim: 40 exec/s: 25 rss: 75Mb L: 32/38 MS: 1 InsertRepeatedBytes- 00:09:07.567 #51 DONE cov: 12465 ft: 15812 corp: 36/714b lim: 40 exec/s: 25 rss: 75Mb 00:09:07.567 ###### Recommended dictionary. ###### 00:09:07.567 "\000E$\233M\264\177(" # Uses: 7 00:09:07.567 ###### End of recommended dictionary. ###### 00:09:07.567 Done 51 runs in 2 second(s) 00:09:07.824 15:10:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_12.conf /var/tmp/suppress_nvmf_fuzz 00:09:07.825 15:10:46 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:07.825 15:10:46 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:07.825 15:10:46 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:09:07.825 15:10:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:09:07.825 15:10:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:09:07.825 15:10:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:09:07.825 15:10:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:09:07.825 15:10:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:09:07.825 15:10:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:07.825 15:10:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:07.825 15:10:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 13 00:09:07.825 15:10:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4413 00:09:07.825 15:10:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:09:07.825 15:10:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:09:07.825 15:10:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:07.825 15:10:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:07.825 15:10:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:07.825 15:10:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 00:09:07.825 [2024-11-20 15:10:46.359732] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:09:07.825 [2024-11-20 15:10:46.359816] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1477836 ] 00:09:08.081 [2024-11-20 15:10:46.574597] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:08.081 [2024-11-20 15:10:46.589374] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:08.081 [2024-11-20 15:10:46.642127] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:08.081 [2024-11-20 15:10:46.658375] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:09:08.081 INFO: Running with entropic power schedule (0xFF, 100). 00:09:08.081 INFO: Seed: 609159038 00:09:08.081 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:09:08.081 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:09:08.081 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:09:08.081 INFO: A corpus is not provided, starting from an empty corpus 00:09:08.081 #2 INITED exec/s: 0 rss: 66Mb 00:09:08.081 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:08.081 This may also happen if the target rejected all inputs we tried so far 00:09:08.081 [2024-11-20 15:10:46.713735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.081 [2024-11-20 15:10:46.713765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.647 NEW_FUNC[1/715]: 0x46bbb8 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:09:08.647 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:08.647 #21 NEW cov: 12226 ft: 12224 corp: 2/12b lim: 40 exec/s: 0 rss: 73Mb L: 11/11 MS: 4 InsertByte-EraseBytes-CrossOver-InsertRepeatedBytes- 00:09:08.647 [2024-11-20 15:10:47.054702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.647 [2024-11-20 15:10:47.054740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.647 #27 NEW cov: 12339 ft: 12934 corp: 3/23b lim: 40 exec/s: 0 rss: 73Mb L: 11/11 MS: 1 CrossOver- 00:09:08.647 [2024-11-20 15:10:47.114815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:f5ff0aff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.647 [2024-11-20 15:10:47.114842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.647 #28 NEW cov: 12345 ft: 13171 corp: 4/34b lim: 40 exec/s: 0 rss: 73Mb L: 11/11 MS: 1 ChangeByte- 00:09:08.647 [2024-11-20 15:10:47.174941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:2bffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.647 [2024-11-20 15:10:47.174968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.647 #29 NEW cov: 12430 ft: 13476 corp: 5/45b lim: 40 exec/s: 0 rss: 73Mb L: 11/11 MS: 1 ChangeByte- 00:09:08.647 [2024-11-20 15:10:47.215019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:2bffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.647 [2024-11-20 15:10:47.215046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.647 #35 NEW cov: 12430 ft: 13674 corp: 6/56b lim: 40 exec/s: 0 rss: 74Mb L: 11/11 MS: 1 ChangeBit- 00:09:08.647 [2024-11-20 15:10:47.275201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ff2bffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.647 [2024-11-20 15:10:47.275227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.647 #36 NEW cov: 12430 ft: 13784 corp: 7/71b lim: 40 exec/s: 0 rss: 74Mb L: 15/15 MS: 1 CopyPart- 00:09:08.647 [2024-11-20 15:10:47.315294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.647 [2024-11-20 15:10:47.315324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.906 #37 NEW cov: 12430 ft: 13847 corp: 8/79b lim: 40 exec/s: 0 rss: 74Mb L: 8/15 MS: 1 EraseBytes- 00:09:08.906 [2024-11-20 15:10:47.355568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:fffff7ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.906 [2024-11-20 15:10:47.355594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.906 [2024-11-20 15:10:47.355657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0aff2bff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.906 [2024-11-20 15:10:47.355674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.906 #38 NEW cov: 12430 ft: 14184 corp: 9/96b lim: 40 exec/s: 0 rss: 74Mb L: 17/17 MS: 1 CrossOver- 00:09:08.906 [2024-11-20 15:10:47.395677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:2bffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.906 [2024-11-20 15:10:47.395703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.906 [2024-11-20 15:10:47.395761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffff0aff cdw11:2bffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.906 
[2024-11-20 15:10:47.395775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.906 #39 NEW cov: 12430 ft: 14205 corp: 10/118b lim: 40 exec/s: 0 rss: 74Mb L: 22/22 MS: 1 CopyPart- 00:09:08.906 [2024-11-20 15:10:47.455708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affff0a cdw11:ff2bffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.906 [2024-11-20 15:10:47.455734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.906 #40 NEW cov: 12430 ft: 14308 corp: 11/130b lim: 40 exec/s: 0 rss: 74Mb L: 12/22 MS: 1 CrossOver- 00:09:08.906 [2024-11-20 15:10:47.495838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ff2bffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.906 [2024-11-20 15:10:47.495864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.906 #41 NEW cov: 12430 ft: 14356 corp: 12/145b lim: 40 exec/s: 0 rss: 74Mb L: 15/22 MS: 1 ChangeBinInt- 00:09:08.906 [2024-11-20 15:10:47.555996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:2bffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.906 [2024-11-20 15:10:47.556021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.906 #42 NEW cov: 12430 ft: 14414 corp: 13/156b lim: 40 exec/s: 0 rss: 74Mb L: 11/22 MS: 1 CopyPart- 00:09:09.164 [2024-11-20 15:10:47.596133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.164 [2024-11-20 15:10:47.596157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.164 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:09.164 #43 NEW cov: 12453 ft: 14492 corp: 14/166b lim: 40 exec/s: 0 rss: 74Mb L: 10/22 MS: 1 InsertRepeatedBytes- 00:09:09.164 [2024-11-20 15:10:47.636333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.164 [2024-11-20 15:10:47.636358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.164 [2024-11-20 15:10:47.636437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffff2bff cdw11:ffff2bff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.164 [2024-11-20 15:10:47.636452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.164 #44 NEW cov: 12453 ft: 14516 corp: 15/185b lim: 40 exec/s: 0 rss: 74Mb L: 19/22 MS: 1 CrossOver- 00:09:09.164 [2024-11-20 15:10:47.676346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:2bfff5ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.164 [2024-11-20 15:10:47.676374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.164 #45 NEW cov: 12453 ft: 14543 corp: 16/196b lim: 40 exec/s: 45 rss: 74Mb L: 11/22 MS: 1 CMP- DE: "\365\377\377\377"- 00:09:09.165 [2024-11-20 15:10:47.736501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:2bffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.165 [2024-11-20 15:10:47.736526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.165 #46 NEW cov: 12453 ft: 14563 corp: 17/209b lim: 40 exec/s: 46 rss: 74Mb L: 13/22 MS: 1 CrossOver- 00:09:09.165 [2024-11-20 15:10:47.776627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affef0a cdw11:ff2bffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.165 [2024-11-20 15:10:47.776652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.165 #47 NEW cov: 12453 ft: 14614 corp: 18/221b lim: 40 exec/s: 47 rss: 74Mb L: 12/22 MS: 1 ChangeBit- 00:09:09.165 [2024-11-20 15:10:47.836925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:fff5ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.165 [2024-11-20 15:10:47.836950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.165 [2024-11-20 15:10:47.837009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ff2bffff cdw11:2bffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.165 [2024-11-20 15:10:47.837024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.424 #48 NEW cov: 12453 ft: 14637 corp: 19/240b lim: 40 exec/s: 48 rss: 74Mb L: 19/22 MS: 1 PersAutoDict- DE: "\365\377\377\377"- 00:09:09.424 [2024-11-20 15:10:47.876917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fffd0aff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.424 [2024-11-20 15:10:47.876942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.424 #49 NEW cov: 12453 ft: 14692 corp: 20/251b lim: 40 exec/s: 49 rss: 74Mb L: 11/22 MS: 1 ChangeBit- 00:09:09.424 [2024-11-20 15:10:47.917035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffe50a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.424 [2024-11-20 15:10:47.917060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.424 #51 NEW cov: 12453 ft: 14752 corp: 21/259b lim: 40 exec/s: 51 rss: 74Mb L: 8/22 MS: 2 EraseBytes-InsertByte- 00:09:09.424 [2024-11-20 15:10:47.977170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:dfffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.424 [2024-11-20 15:10:47.977195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.424 #52 NEW cov: 12453 ft: 14765 corp: 22/270b lim: 40 exec/s: 52 rss: 74Mb L: 11/22 MS: 1 ChangeBit- 00:09:09.424 [2024-11-20 15:10:48.017326] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffe50a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.424 [2024-11-20 15:10:48.017352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.424 #53 NEW cov: 12453 ft: 14776 corp: 23/278b lim: 40 exec/s: 53 rss: 74Mb L: 8/22 MS: 1 CopyPart- 00:09:09.424 [2024-11-20 15:10:48.077623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff0aff2b cdw11:ffff2bff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.424 [2024-11-20 15:10:48.077647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.424 [2024-11-20 15:10:48.077709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffff0aff cdw11:2bffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.424 [2024-11-20 15:10:48.077722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.683 #54 NEW cov: 12453 ft: 14823 corp: 24/299b lim: 40 exec/s: 54 rss: 74Mb L: 21/22 MS: 1 CrossOver- 00:09:09.683 [2024-11-20 15:10:48.137656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff2afe0a cdw11:f5ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.683 [2024-11-20 15:10:48.137682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.683 #58 NEW cov: 12453 ft: 14890 corp: 25/307b lim: 40 exec/s: 58 rss: 74Mb L: 8/22 MS: 4 EraseBytes-ChangeByte-PersAutoDict-InsertByte- DE: "\365\377\377\377"- 00:09:09.683 [2024-11-20 15:10:48.177814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.683 [2024-11-20 15:10:48.177839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.683 #59 NEW cov: 12453 ft: 14907 corp: 26/319b lim: 40 exec/s: 59 rss: 74Mb L: 12/22 MS: 1 InsertByte- 00:09:09.683 [2024-11-20 15:10:48.218323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.683 [2024-11-20 15:10:48.218348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.683 [2024-11-20 15:10:48.218406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.683 [2024-11-20 15:10:48.218420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.683 [2024-11-20 15:10:48.218478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.683 [2024-11-20 15:10:48.218492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:09.683 [2024-11-20 15:10:48.218552] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:0aff2bff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.683 [2024-11-20 15:10:48.218565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:09.683 #60 NEW cov: 12453 ft: 15449 corp: 27/356b lim: 40 exec/s: 60 rss: 74Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:09:09.683 [2024-11-20 15:10:48.258012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:f5ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.683 [2024-11-20 15:10:48.258037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.683 #61 NEW cov: 12453 ft: 15519 corp: 28/367b lim: 40 exec/s: 61 rss: 74Mb L: 11/37 MS: 1 PersAutoDict- DE: "\365\377\377\377"- 00:09:09.683 [2024-11-20 15:10:48.298257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:f6fff7ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.683 [2024-11-20 15:10:48.298281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.683 [2024-11-20 15:10:48.298345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0aff2bff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.683 [2024-11-20 15:10:48.298363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.683 #62 NEW cov: 12453 ft: 15530 corp: 29/384b lim: 40 exec/s: 62 rss: 74Mb L: 17/37 MS: 1 ChangeBinInt- 00:09:09.684 [2024-11-20 15:10:48.358285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffedffff cdw11:ffffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.684 [2024-11-20 15:10:48.358310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.942 #63 NEW cov: 12453 ft: 15534 corp: 30/392b lim: 40 exec/s: 63 rss: 74Mb L: 8/37 MS: 1 ChangeByte- 00:09:09.942 [2024-11-20 15:10:48.418505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.942 [2024-11-20 15:10:48.418530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.942 #64 NEW cov: 12453 ft: 15567 corp: 31/400b lim: 40 exec/s: 64 rss: 74Mb L: 8/37 MS: 1 EraseBytes- 00:09:09.942 [2024-11-20 15:10:48.458601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.942 [2024-11-20 15:10:48.458626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.942 #65 NEW cov: 12453 ft: 15579 corp: 32/408b lim: 40 exec/s: 65 rss: 74Mb L: 8/37 MS: 1 EraseBytes- 00:09:09.942 [2024-11-20 15:10:48.498719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fffff80a cdw11:ff2bffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.942 [2024-11-20 15:10:48.498746] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.942 #66 NEW cov: 12453 ft: 15586 corp: 33/423b lim: 40 exec/s: 66 rss: 74Mb L: 15/37 MS: 1 ChangeBinInt- 00:09:09.942 [2024-11-20 15:10:48.538928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.942 [2024-11-20 15:10:48.538955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.942 [2024-11-20 15:10:48.539016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffff2bff cdw11:ffff2bff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.942 [2024-11-20 15:10:48.539030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.942 [2024-11-20 15:10:48.599108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.942 [2024-11-20 15:10:48.599134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.943 [2024-11-20 15:10:48.599194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffff2bff cdw11:ffff2b00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:09.943 [2024-11-20 15:10:48.599208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.943 #68 NEW cov: 12453 ft: 15591 corp: 34/443b lim: 40 exec/s: 68 rss: 75Mb L: 20/37 MS: 2 InsertByte-ChangeBinInt- 00:09:10.202 [2024-11-20 15:10:48.639080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff2bff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.202 [2024-11-20 15:10:48.639106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.202 #69 NEW cov: 12453 ft: 15601 corp: 35/452b lim: 40 exec/s: 69 rss: 75Mb L: 9/37 MS: 1 EraseBytes- 00:09:10.202 [2024-11-20 15:10:48.679341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:fffff7ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.202 [2024-11-20 15:10:48.679370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.202 [2024-11-20 15:10:48.679428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0afff5ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.202 [2024-11-20 15:10:48.679442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.202 #70 NEW cov: 12453 ft: 15629 corp: 36/469b lim: 40 exec/s: 35 rss: 75Mb L: 17/37 MS: 1 PersAutoDict- DE: "\365\377\377\377"- 00:09:10.202 #70 DONE cov: 12453 ft: 15629 corp: 36/469b lim: 40 exec/s: 35 rss: 75Mb 00:09:10.202 ###### Recommended dictionary. ###### 00:09:10.202 "\365\377\377\377" # Uses: 4 00:09:10.202 ###### End of recommended dictionary. 
###### 00:09:10.202 Done 70 runs in 2 second(s) 00:09:10.202 15:10:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_13.conf /var/tmp/suppress_nvmf_fuzz 00:09:10.202 15:10:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:10.202 15:10:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:10.202 15:10:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:09:10.202 15:10:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:09:10.202 15:10:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:09:10.202 15:10:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:09:10.202 15:10:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:09:10.202 15:10:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:09:10.202 15:10:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:10.202 15:10:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:10.202 15:10:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 14 00:09:10.202 15:10:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4414 00:09:10.202 15:10:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:09:10.202 15:10:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:09:10.202 15:10:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:10.202 15:10:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:10.202 15:10:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:10.202 15:10:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 00:09:10.202 [2024-11-20 15:10:48.839411] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
00:09:10.202 [2024-11-20 15:10:48.839486] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1478190 ] 00:09:10.461 [2024-11-20 15:10:49.055985] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:10.461 [2024-11-20 15:10:49.070768] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:10.461 [2024-11-20 15:10:49.123548] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:10.461 [2024-11-20 15:10:49.139772] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:09:10.721 INFO: Running with entropic power schedule (0xFF, 100). 00:09:10.721 INFO: Seed: 3089175023 00:09:10.721 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:09:10.721 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:09:10.721 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:09:10.721 INFO: A corpus is not provided, starting from an empty corpus 00:09:10.721 #2 INITED exec/s: 0 rss: 66Mb 00:09:10.721 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:10.721 This may also happen if the target rejected all inputs we tried so far 00:09:10.721 [2024-11-20 15:10:49.188883] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.721 [2024-11-20 15:10:49.188913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.980 NEW_FUNC[1/717]: 0x46d788 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:09:10.980 NEW_FUNC[2/717]: 0x48ecd8 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:09:10.980 #9 NEW cov: 12227 ft: 12221 corp: 2/12b lim: 35 exec/s: 0 rss: 73Mb L: 11/11 MS: 2 CMP-InsertRepeatedBytes- DE: "\003\000"- 00:09:10.980 [2024-11-20 15:10:49.529831] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.980 [2024-11-20 15:10:49.529869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.980 #10 NEW cov: 12340 ft: 12718 corp: 3/23b lim: 35 exec/s: 0 rss: 73Mb L: 11/11 MS: 1 PersAutoDict- DE: "\003\000"- 00:09:10.980 [2024-11-20 15:10:49.590042] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.980 [2024-11-20 15:10:49.590072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.981 [2024-11-20 15:10:49.590133] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000fd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.981 [2024-11-20 15:10:49.590149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.981 #11 NEW cov: 12349 ft: 13659 
corp: 4/41b lim: 35 exec/s: 0 rss: 74Mb L: 18/18 MS: 1 CrossOver- 00:09:10.981 [2024-11-20 15:10:49.629996] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.981 [2024-11-20 15:10:49.630024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.240 #12 NEW cov: 12434 ft: 14063 corp: 5/52b lim: 35 exec/s: 0 rss: 74Mb L: 11/18 MS: 1 ChangeBit- 00:09:11.240 [2024-11-20 15:10:49.690129] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.240 [2024-11-20 15:10:49.690157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.240 #13 NEW cov: 12434 ft: 14177 corp: 6/63b lim: 35 exec/s: 0 rss: 74Mb L: 11/18 MS: 1 ChangeBit- 00:09:11.240 [2024-11-20 15:10:49.730590] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.240 [2024-11-20 15:10:49.730615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.240 [2024-11-20 15:10:49.730694] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.240 [2024-11-20 15:10:49.730708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.240 [2024-11-20 15:10:49.730772] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.240 [2024-11-20 15:10:49.730786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.240 #14 NEW cov: 12441 ft: 14558 corp: 7/86b lim: 35 exec/s: 0 rss: 74Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:09:11.240 [2024-11-20 15:10:49.770519] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.240 [2024-11-20 15:10:49.770547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.240 [2024-11-20 15:10:49.770609] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000fd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.240 [2024-11-20 15:10:49.770626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.240 #15 NEW cov: 12441 ft: 14633 corp: 8/104b lim: 35 exec/s: 0 rss: 74Mb L: 18/23 MS: 1 ChangeBit- 00:09:11.240 [2024-11-20 15:10:49.830687] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.240 [2024-11-20 15:10:49.830714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.240 [2024-11-20 15:10:49.830793] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000fd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.240 [2024-11-20 
15:10:49.830812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.240 #16 NEW cov: 12441 ft: 14682 corp: 9/122b lim: 35 exec/s: 0 rss: 74Mb L: 18/23 MS: 1 ChangeBinInt- 00:09:11.240 [2024-11-20 15:10:49.891014] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.240 [2024-11-20 15:10:49.891041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.240 [2024-11-20 15:10:49.891104] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.240 [2024-11-20 15:10:49.891121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.240 [2024-11-20 15:10:49.891178] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES LBA RANGE TYPE cid:6 cdw10:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.240 [2024-11-20 15:10:49.891192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.240 NEW_FUNC[1/1]: 0x489f18 in feat_lba_range_type /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:289 00:09:11.240 #17 NEW cov: 12452 ft: 14742 corp: 10/143b lim: 35 exec/s: 0 rss: 74Mb L: 21/23 MS: 1 CopyPart- 00:09:11.498 [2024-11-20 15:10:49.930821] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.498 [2024-11-20 15:10:49.930849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.498 #18 NEW cov: 12452 ft: 14850 corp: 11/151b lim: 35 exec/s: 0 rss: 74Mb L: 8/23 MS: 1 EraseBytes- 00:09:11.498 [2024-11-20 15:10:49.991100] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.498 [2024-11-20 15:10:49.991128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.498 [2024-11-20 15:10:49.991207] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:000000fd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.498 [2024-11-20 15:10:49.991225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.498 #19 NEW cov: 12452 ft: 14888 corp: 12/166b lim: 35 exec/s: 0 rss: 74Mb L: 15/23 MS: 1 CopyPart- 00:09:11.498 NEW_FUNC[1/1]: 0x13a7dd8 in nvmf_ctrlr_set_features_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1766 00:09:11.498 #21 NEW cov: 12475 ft: 14979 corp: 13/173b lim: 35 exec/s: 0 rss: 74Mb L: 7/23 MS: 2 EraseBytes-InsertByte- 00:09:11.498 [2024-11-20 15:10:50.101484] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.498 [2024-11-20 15:10:50.101526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.498 [2024-11-20 
15:10:50.101588] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.498 [2024-11-20 15:10:50.101603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.498 #22 NEW cov: 12475 ft: 15012 corp: 14/191b lim: 35 exec/s: 0 rss: 74Mb L: 18/23 MS: 1 ChangeBinInt- 00:09:11.498 [2024-11-20 15:10:50.161450] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.498 [2024-11-20 15:10:50.161482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.756 #23 NEW cov: 12475 ft: 15028 corp: 15/202b lim: 35 exec/s: 23 rss: 74Mb L: 11/23 MS: 1 EraseBytes- 00:09:11.756 [2024-11-20 15:10:50.221935] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.756 [2024-11-20 15:10:50.221962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.756 [2024-11-20 15:10:50.222026] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.756 [2024-11-20 15:10:50.222040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.756 [2024-11-20 15:10:50.222101] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.756 [2024-11-20 15:10:50.222115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.756 #24 NEW cov: 12475 ft: 15031 corp: 16/225b lim: 35 exec/s: 24 rss: 74Mb L: 23/23 MS: 1 ChangeByte- 00:09:11.756 [2024-11-20 15:10:50.281785] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.756 [2024-11-20 15:10:50.281814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.756 #25 NEW cov: 12475 ft: 15055 corp: 17/232b lim: 35 exec/s: 25 rss: 75Mb L: 7/23 MS: 1 ChangeBinInt- 00:09:11.756 [2024-11-20 15:10:50.341934] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.756 [2024-11-20 15:10:50.341961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.756 #26 NEW cov: 12475 ft: 15062 corp: 18/243b lim: 35 exec/s: 26 rss: 75Mb L: 11/23 MS: 1 ChangeBit- 00:09:11.756 [2024-11-20 15:10:50.382208] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.756 [2024-11-20 15:10:50.382233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.756 [2024-11-20 15:10:50.382296] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:000000fd SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:09:11.756 [2024-11-20 15:10:50.382336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.756 #27 NEW cov: 12475 ft: 15080 corp: 19/262b lim: 35 exec/s: 27 rss: 75Mb L: 19/23 MS: 1 InsertByte- 00:09:11.757 [2024-11-20 15:10:50.422336] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.757 [2024-11-20 15:10:50.422363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.757 [2024-11-20 15:10:50.422425] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000fd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.757 [2024-11-20 15:10:50.422442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:12.015 #28 NEW cov: 12475 ft: 15107 corp: 20/281b lim: 35 exec/s: 28 rss: 75Mb L: 19/23 MS: 1 CopyPart- 00:09:12.015 [2024-11-20 15:10:50.482303] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.015 [2024-11-20 15:10:50.482334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.015 #29 NEW cov: 12475 ft: 15127 corp: 21/292b lim: 35 exec/s: 29 rss: 75Mb L: 11/23 MS: 1 CopyPart- 00:09:12.015 [2024-11-20 15:10:50.542449] ctrlr.c:1627:nvmf_ctrlr_set_features_power_management: *ERROR*: Invalid power state 31 00:09:12.015 [2024-11-20 15:10:50.542678] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.015 [2024-11-20 15:10:50.542706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.015 [2024-11-20 15:10:50.542765] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES POWER MANAGEMENT cid:5 cdw10:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.015 [2024-11-20 15:10:50.542779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:12.015 NEW_FUNC[1/2]: 0x4894e8 in feat_power_management /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:282 00:09:12.015 NEW_FUNC[2/2]: 0x13a1268 in nvmf_ctrlr_set_features_power_management /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1618 00:09:12.015 #30 NEW cov: 12526 ft: 15194 corp: 22/311b lim: 35 exec/s: 30 rss: 75Mb L: 19/23 MS: 1 ChangeBinInt- 00:09:12.015 [2024-11-20 15:10:50.602680] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.015 [2024-11-20 15:10:50.602708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.015 #31 NEW cov: 12526 ft: 15286 corp: 23/323b lim: 35 exec/s: 31 rss: 75Mb L: 12/23 MS: 1 InsertByte- 00:09:12.015 #32 NEW cov: 12526 ft: 15310 corp: 24/330b lim: 35 exec/s: 32 rss: 75Mb L: 7/23 MS: 1 PersAutoDict- DE: "\003\000"- 00:09:12.274 #33 NEW cov: 12526 ft: 15347 corp: 25/337b lim: 35 exec/s: 33 rss: 75Mb L: 7/23 MS: 1 
ChangeBit- 00:09:12.274 [2024-11-20 15:10:50.723155] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.274 [2024-11-20 15:10:50.723182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.274 [2024-11-20 15:10:50.723238] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000fd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.274 [2024-11-20 15:10:50.723254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:12.274 #34 NEW cov: 12526 ft: 15361 corp: 26/356b lim: 35 exec/s: 34 rss: 75Mb L: 19/23 MS: 1 PersAutoDict- DE: "\003\000"- 00:09:12.274 [2024-11-20 15:10:50.763300] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.274 [2024-11-20 15:10:50.763332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.274 [2024-11-20 15:10:50.763392] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000fd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.274 [2024-11-20 15:10:50.763409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:12.274 #35 NEW cov: 12526 ft: 15456 corp: 27/376b lim: 35 exec/s: 35 rss: 75Mb L: 20/23 MS: 1 InsertByte- 00:09:12.274 [2024-11-20 15:10:50.803253] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.274 [2024-11-20 15:10:50.803279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.274 #36 NEW cov: 12526 ft: 15484 corp: 28/387b lim: 35 exec/s: 36 rss: 75Mb L: 11/23 MS: 1 ShuffleBytes- 00:09:12.274 [2024-11-20 15:10:50.863572] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.274 [2024-11-20 15:10:50.863599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.274 [2024-11-20 15:10:50.863661] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:000000fd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.275 [2024-11-20 15:10:50.863675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:12.275 #37 NEW cov: 12526 ft: 15508 corp: 29/406b lim: 35 exec/s: 37 rss: 75Mb L: 19/23 MS: 1 ChangeByte- 00:09:12.275 [2024-11-20 15:10:50.904007] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.275 [2024-11-20 15:10:50.904034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.275 [2024-11-20 15:10:50.904094] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000fd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.275 [2024-11-20 
15:10:50.904112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:12.275 [2024-11-20 15:10:50.904169] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ba SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.275 [2024-11-20 15:10:50.904186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:12.275 [2024-11-20 15:10:50.904244] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:8000003b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.275 [2024-11-20 15:10:50.904259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:12.275 #38 NEW cov: 12526 ft: 15799 corp: 30/438b lim: 35 exec/s: 38 rss: 75Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:09:12.533 [2024-11-20 15:10:50.964195] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.533 [2024-11-20 15:10:50.964223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.533 [2024-11-20 15:10:50.964300] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.533 [2024-11-20 15:10:50.964318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:12.533 [2024-11-20 15:10:50.964385] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.533 [2024-11-20 15:10:50.964399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:12.533 [2024-11-20 15:10:50.964453] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES POWER MANAGEMENT cid:7 cdw10:80000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.533 [2024-11-20 15:10:50.964468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:12.533 #39 NEW cov: 12526 ft: 15814 corp: 31/468b lim: 35 exec/s: 39 rss: 75Mb L: 30/32 MS: 1 InsertRepeatedBytes- 00:09:12.533 [2024-11-20 15:10:51.024209] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.533 [2024-11-20 15:10:51.024234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.533 [2024-11-20 15:10:51.024295] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.533 [2024-11-20 15:10:51.024309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:12.533 [2024-11-20 15:10:51.024392] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES SOFTWARE PROGRESS MARKER cid:6 cdw10:00000080 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.533 [2024-11-20 15:10:51.024407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 
p:0 m:0 dnr:0 00:09:12.533 #40 NEW cov: 12526 ft: 15823 corp: 32/491b lim: 35 exec/s: 40 rss: 75Mb L: 23/32 MS: 1 ChangeBit- 00:09:12.533 [2024-11-20 15:10:51.084322] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.533 [2024-11-20 15:10:51.084366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.533 [2024-11-20 15:10:51.084427] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.533 [2024-11-20 15:10:51.084441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:12.533 [2024-11-20 15:10:51.084500] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.533 [2024-11-20 15:10:51.084513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:12.533 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:12.533 #41 NEW cov: 12549 ft: 15864 corp: 33/518b lim: 35 exec/s: 41 rss: 75Mb L: 27/32 MS: 1 EraseBytes- 00:09:12.533 [2024-11-20 15:10:51.144710] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.533 [2024-11-20 15:10:51.144735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.533 [2024-11-20 15:10:51.144795] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.533 [2024-11-20 15:10:51.144809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:12.533 [2024-11-20 15:10:51.144867] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.533 [2024-11-20 15:10:51.144881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:12.533 [2024-11-20 15:10:51.144935] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.533 [2024-11-20 15:10:51.144952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:12.533 #42 NEW cov: 12549 ft: 15925 corp: 34/551b lim: 35 exec/s: 42 rss: 75Mb L: 33/33 MS: 1 CopyPart- 00:09:12.533 [2024-11-20 15:10:51.184465] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.533 [2024-11-20 15:10:51.184491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.533 [2024-11-20 15:10:51.184550] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000fd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.533 [2024-11-20 15:10:51.184566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:12.791 #43 NEW cov: 12549 ft: 15963 corp: 35/570b lim: 35 exec/s: 21 rss: 75Mb L: 19/33 MS: 1 PersAutoDict- DE: "\003\000"- 00:09:12.791 #43 DONE cov: 12549 ft: 15963 corp: 35/570b lim: 35 exec/s: 21 rss: 75Mb 00:09:12.791 ###### Recommended dictionary. ###### 00:09:12.791 "\003\000" # Uses: 4 00:09:12.791 ###### End of recommended dictionary. ###### 00:09:12.791 Done 43 runs in 2 second(s) 00:09:12.791 15:10:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_14.conf /var/tmp/suppress_nvmf_fuzz 00:09:12.791 15:10:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:12.791 15:10:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:12.791 15:10:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:09:12.791 15:10:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:09:12.791 15:10:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:09:12.791 15:10:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:09:12.791 15:10:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:09:12.791 15:10:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:09:12.791 15:10:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:12.791 15:10:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:12.791 15:10:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 15 00:09:12.791 15:10:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4415 00:09:12.791 15:10:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:09:12.791 15:10:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:09:12.791 15:10:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:12.791 15:10:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:12.791 15:10:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:12.791 15:10:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 00:09:12.791 [2024-11-20 15:10:51.367740] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
00:09:12.791 [2024-11-20 15:10:51.367812] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1478471 ] 00:09:13.050 [2024-11-20 15:10:51.585734] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:13.050 [2024-11-20 15:10:51.600306] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:13.050 [2024-11-20 15:10:51.653054] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:13.050 [2024-11-20 15:10:51.669289] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:09:13.050 INFO: Running with entropic power schedule (0xFF, 100). 00:09:13.050 INFO: Seed: 1324192117 00:09:13.050 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:09:13.050 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:09:13.050 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:09:13.050 INFO: A corpus is not provided, starting from an empty corpus 00:09:13.050 #2 INITED exec/s: 0 rss: 66Mb 00:09:13.050 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:13.050 This may also happen if the target rejected all inputs we tried so far 00:09:13.050 [2024-11-20 15:10:51.715022] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.050 [2024-11-20 15:10:51.715051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.050 [2024-11-20 15:10:51.715111] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.050 [2024-11-20 15:10:51.715125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:13.569 NEW_FUNC[1/715]: 0x46ecc8 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:09:13.569 NEW_FUNC[2/715]: 0x4894e8 in feat_power_management /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:282 00:09:13.569 #16 NEW cov: 12210 ft: 12228 corp: 2/26b lim: 35 exec/s: 0 rss: 73Mb L: 25/25 MS: 4 ChangeBit-InsertRepeatedBytes-CrossOver-InsertRepeatedBytes- 00:09:13.569 [2024-11-20 15:10:52.055721] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.569 [2024-11-20 15:10:52.055757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.569 NEW_FUNC[1/1]: 0x1fa35b8 in thread_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1080 00:09:13.569 #22 NEW cov: 12344 ft: 12952 corp: 3/41b lim: 35 exec/s: 0 rss: 73Mb L: 15/25 MS: 1 EraseBytes- 00:09:13.569 [2024-11-20 15:10:52.125906] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.569 [2024-11-20 15:10:52.125934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.569 [2024-11-20 15:10:52.125991] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.569 [2024-11-20 15:10:52.126005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.569 [2024-11-20 15:10:52.126065] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.569 [2024-11-20 15:10:52.126079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:13.569 #26 NEW cov: 12350 ft: 13205 corp: 4/63b lim: 35 exec/s: 0 rss: 73Mb L: 22/25 MS: 4 CopyPart-CrossOver-CopyPart-InsertRepeatedBytes- 00:09:13.569 [2024-11-20 15:10:52.165961] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.569 [2024-11-20 15:10:52.165986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.569 #27 NEW cov: 12435 ft: 13504 corp: 5/78b lim: 35 exec/s: 0 rss: 74Mb L: 15/25 MS: 1 ShuffleBytes- 00:09:13.569 [2024-11-20 15:10:52.226161] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.569 [2024-11-20 15:10:52.226186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.569 [2024-11-20 15:10:52.226243] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.569 [2024-11-20 15:10:52.226258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.569 [2024-11-20 15:10:52.226320] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.569 [2024-11-20 15:10:52.226334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:13.828 #28 NEW cov: 12435 ft: 13727 corp: 6/100b lim: 35 exec/s: 0 rss: 74Mb L: 22/25 MS: 1 CopyPart- 00:09:13.828 [2024-11-20 15:10:52.286406] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.828 [2024-11-20 15:10:52.286432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.828 [2024-11-20 15:10:52.286492] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.828 [2024-11-20 15:10:52.286506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:13.828 #29 NEW cov: 12435 ft: 13841 corp: 7/125b lim: 35 exec/s: 0 rss: 74Mb L: 25/25 MS: 1 CopyPart- 00:09:13.828 [2024-11-20 15:10:52.326508] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.828 [2024-11-20 15:10:52.326534] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.828 [2024-11-20 15:10:52.326593] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.828 [2024-11-20 15:10:52.326607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:13.828 #30 NEW cov: 12435 ft: 13915 corp: 8/150b lim: 35 exec/s: 0 rss: 74Mb L: 25/25 MS: 1 CMP- DE: "\001\037"- 00:09:13.828 [2024-11-20 15:10:52.386693] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.828 [2024-11-20 15:10:52.386718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.828 [2024-11-20 15:10:52.386777] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.828 [2024-11-20 15:10:52.386791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:13.828 #36 NEW cov: 12435 ft: 14029 corp: 9/175b lim: 35 exec/s: 0 rss: 74Mb L: 25/25 MS: 1 ChangeByte- 00:09:13.828 [2024-11-20 15:10:52.426931] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.828 [2024-11-20 15:10:52.426957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.828 [2024-11-20 15:10:52.427015] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006c9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.828 [2024-11-20 15:10:52.427029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:13.828 [2024-11-20 15:10:52.427085] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.828 [2024-11-20 15:10:52.427104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:13.828 #37 NEW cov: 12435 ft: 14377 corp: 10/207b lim: 35 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:09:13.828 [2024-11-20 15:10:52.486985] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.828 [2024-11-20 15:10:52.487009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.828 [2024-11-20 15:10:52.487088] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:13.828 [2024-11-20 15:10:52.487104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:13.828 #38 NEW cov: 12435 ft: 14444 corp: 11/232b lim: 35 exec/s: 0 rss: 74Mb L: 25/32 MS: 1 CrossOver- 00:09:14.087 [2024-11-20 15:10:52.527069] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.087 [2024-11-20 
15:10:52.527095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.087 [2024-11-20 15:10:52.527156] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000071f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.087 [2024-11-20 15:10:52.527171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.087 #39 NEW cov: 12435 ft: 14488 corp: 12/257b lim: 35 exec/s: 0 rss: 74Mb L: 25/32 MS: 1 PersAutoDict- DE: "\001\037"- 00:09:14.087 [2024-11-20 15:10:52.567181] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.087 [2024-11-20 15:10:52.567207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.087 [2024-11-20 15:10:52.567265] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.087 [2024-11-20 15:10:52.567279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.087 #40 NEW cov: 12435 ft: 14522 corp: 13/282b lim: 35 exec/s: 0 rss: 74Mb L: 25/32 MS: 1 ChangeByte- 00:09:14.087 [2024-11-20 15:10:52.607340] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.087 [2024-11-20 15:10:52.607366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.087 NEW_FUNC[1/2]: 0x48b638 in feat_error_recover /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:304 00:09:14.087 NEW_FUNC[2/2]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:14.087 #41 NEW cov: 12485 ft: 14663 corp: 14/307b lim: 35 exec/s: 0 rss: 74Mb L: 25/32 MS: 1 ChangeBinInt- 00:09:14.087 [2024-11-20 15:10:52.667449] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.087 [2024-11-20 15:10:52.667475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.087 [2024-11-20 15:10:52.667538] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.087 [2024-11-20 15:10:52.667553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.087 [2024-11-20 15:10:52.667614] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.087 [2024-11-20 15:10:52.667628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.087 #42 NEW cov: 12485 ft: 14676 corp: 15/329b lim: 35 exec/s: 42 rss: 74Mb L: 22/32 MS: 1 ChangeBinInt- 00:09:14.087 [2024-11-20 15:10:52.727734] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.087 [2024-11-20 15:10:52.727760] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.087 [2024-11-20 15:10:52.727822] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.087 [2024-11-20 15:10:52.727836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.087 [2024-11-20 15:10:52.727896] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.087 [2024-11-20 15:10:52.727909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.087 [2024-11-20 15:10:52.727965] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.087 [2024-11-20 15:10:52.727979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:14.087 #43 NEW cov: 12485 ft: 14865 corp: 16/357b lim: 35 exec/s: 43 rss: 74Mb L: 28/32 MS: 1 CrossOver- 00:09:14.346 [2024-11-20 15:10:52.787891] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.346 [2024-11-20 15:10:52.787917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.346 [2024-11-20 15:10:52.787977] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.346 [2024-11-20 15:10:52.787991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.346 [2024-11-20 15:10:52.788049] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.346 [2024-11-20 15:10:52.788063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.346 [2024-11-20 15:10:52.788121] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.346 [2024-11-20 15:10:52.788135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:14.346 #44 NEW cov: 12485 ft: 14887 corp: 17/385b lim: 35 exec/s: 44 rss: 74Mb L: 28/32 MS: 1 ChangeBinInt- 00:09:14.346 [2024-11-20 15:10:52.847862] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.346 [2024-11-20 15:10:52.847888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.346 #45 NEW cov: 12485 ft: 14911 corp: 18/404b lim: 35 exec/s: 45 rss: 74Mb L: 19/32 MS: 1 CrossOver- 00:09:14.346 [2024-11-20 15:10:52.907844] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.346 [2024-11-20 15:10:52.907869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.346 #50 NEW cov: 12485 ft: 15128 corp: 19/411b lim: 35 exec/s: 50 rss: 74Mb L: 7/32 MS: 5 PersAutoDict-ChangeByte-ShuffleBytes-ChangeByte-InsertRepeatedBytes- DE: "\001\037"- 00:09:14.346 [2024-11-20 15:10:52.948050] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.346 [2024-11-20 15:10:52.948079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.346 [2024-11-20 15:10:52.948141] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.346 [2024-11-20 15:10:52.948155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.346 #51 NEW cov: 12485 ft: 15133 corp: 20/426b lim: 35 exec/s: 51 rss: 74Mb L: 15/32 MS: 1 ChangeBinInt- 00:09:14.346 [2024-11-20 15:10:52.988056] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000012c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.346 [2024-11-20 15:10:52.988081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.346 #52 NEW cov: 12485 ft: 15162 corp: 21/434b lim: 35 exec/s: 52 rss: 74Mb L: 8/32 MS: 1 InsertByte- 00:09:14.606 [2024-11-20 15:10:53.048556] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.606 [2024-11-20 15:10:53.048581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.606 [2024-11-20 15:10:53.048641] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000071f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.606 [2024-11-20 15:10:53.048656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.606 #53 NEW cov: 12485 ft: 15174 corp: 22/459b lim: 35 exec/s: 53 rss: 74Mb L: 25/32 MS: 1 ShuffleBytes- 00:09:14.606 [2024-11-20 15:10:53.088672] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.606 [2024-11-20 15:10:53.088698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.606 #54 NEW cov: 12485 ft: 15178 corp: 23/484b lim: 35 exec/s: 54 rss: 74Mb L: 25/32 MS: 1 CrossOver- 00:09:14.606 [2024-11-20 15:10:53.148966] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.606 [2024-11-20 15:10:53.148992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.606 [2024-11-20 15:10:53.149055] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.606 [2024-11-20 15:10:53.149069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.606 [2024-11-20 15:10:53.149128] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET 
FEATURES RESERVED cid:7 cdw10:00000373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.606 [2024-11-20 15:10:53.149142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:14.606 #55 NEW cov: 12485 ft: 15220 corp: 24/515b lim: 35 exec/s: 55 rss: 74Mb L: 31/32 MS: 1 InsertRepeatedBytes- 00:09:14.606 [2024-11-20 15:10:53.188956] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.606 [2024-11-20 15:10:53.188981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.606 [2024-11-20 15:10:53.189042] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.606 [2024-11-20 15:10:53.189056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.606 [2024-11-20 15:10:53.189112] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.606 [2024-11-20 15:10:53.189125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.606 [2024-11-20 15:10:53.189191] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.606 [2024-11-20 15:10:53.189204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:14.606 #56 NEW cov: 12485 ft: 15226 corp: 25/543b lim: 35 exec/s: 56 rss: 75Mb L: 28/32 MS: 1 ChangeBit- 00:09:14.606 [2024-11-20 15:10:53.248954] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.606 [2024-11-20 15:10:53.248979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.606 #57 NEW cov: 12485 ft: 15236 corp: 26/558b lim: 35 exec/s: 57 rss: 75Mb L: 15/32 MS: 1 ChangeBinInt- 00:09:14.606 [2024-11-20 15:10:53.289103] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.606 [2024-11-20 15:10:53.289128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.606 [2024-11-20 15:10:53.289187] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES SOFTWARE PROGRESS MARKER cid:5 cdw10:00000780 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.606 [2024-11-20 15:10:53.289202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.867 #58 NEW cov: 12485 ft: 15333 corp: 27/573b lim: 35 exec/s: 58 rss: 75Mb L: 15/32 MS: 1 ChangeBit- 00:09:14.867 [2024-11-20 15:10:53.349515] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.867 [2024-11-20 15:10:53.349539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.867 [2024-11-20 15:10:53.349611] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.867 [2024-11-20 15:10:53.349626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.867 [2024-11-20 15:10:53.349681] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.867 [2024-11-20 15:10:53.349695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:14.867 #59 NEW cov: 12485 ft: 15341 corp: 28/606b lim: 35 exec/s: 59 rss: 75Mb L: 33/33 MS: 1 PersAutoDict- DE: "\001\037"- 00:09:14.867 [2024-11-20 15:10:53.409557] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.867 [2024-11-20 15:10:53.409583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.867 [2024-11-20 15:10:53.409642] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.867 [2024-11-20 15:10:53.409657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.867 #65 NEW cov: 12485 ft: 15377 corp: 29/632b lim: 35 exec/s: 65 rss: 75Mb L: 26/33 MS: 1 InsertByte- 00:09:14.867 [2024-11-20 15:10:53.469821] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.867 [2024-11-20 15:10:53.469846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.867 [2024-11-20 15:10:53.469920] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.867 [2024-11-20 15:10:53.469935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.867 [2024-11-20 15:10:53.469995] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.867 [2024-11-20 15:10:53.470009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:14.867 #66 NEW cov: 12485 ft: 15378 corp: 30/663b lim: 35 exec/s: 66 rss: 75Mb L: 31/33 MS: 1 CopyPart- 00:09:14.867 [2024-11-20 15:10:53.509596] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.867 [2024-11-20 15:10:53.509621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.867 [2024-11-20 15:10:53.509681] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.867 [2024-11-20 15:10:53.509695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.867 #67 NEW cov: 12485 ft: 15391 corp: 31/678b lim: 35 exec/s: 67 rss: 75Mb L: 15/33 MS: 1 ShuffleBytes- 00:09:14.867 
[2024-11-20 15:10:53.549928] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.867 [2024-11-20 15:10:53.549954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.867 [2024-11-20 15:10:53.550018] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.867 [2024-11-20 15:10:53.550033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:15.127 #68 NEW cov: 12485 ft: 15457 corp: 32/705b lim: 35 exec/s: 68 rss: 75Mb L: 27/33 MS: 1 PersAutoDict- DE: "\001\037"- 00:09:15.127 [2024-11-20 15:10:53.589892] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.127 [2024-11-20 15:10:53.589917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.127 #69 NEW cov: 12485 ft: 15467 corp: 33/720b lim: 35 exec/s: 69 rss: 75Mb L: 15/33 MS: 1 EraseBytes- 00:09:15.127 [2024-11-20 15:10:53.650189] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.127 [2024-11-20 15:10:53.650213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.127 [2024-11-20 15:10:53.650289] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.127 [2024-11-20 15:10:53.650304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:15.127 #70 NEW cov: 12485 ft: 15499 corp: 34/746b lim: 35 exec/s: 70 rss: 75Mb L: 26/33 MS: 1 InsertByte- 00:09:15.127 [2024-11-20 15:10:53.710547] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.127 [2024-11-20 15:10:53.710572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:15.127 [2024-11-20 15:10:53.710634] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000012c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.127 [2024-11-20 15:10:53.710649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.127 [2024-11-20 15:10:53.710708] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000012c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.127 [2024-11-20 15:10:53.710722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:15.127 [2024-11-20 15:10:53.710781] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.127 [2024-11-20 15:10:53.710795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:15.127 [2024-11-20 15:10:53.710849] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.127 [2024-11-20 15:10:53.710863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:15.127 #71 NEW cov: 12485 ft: 15542 corp: 35/781b lim: 35 exec/s: 35 rss: 75Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:09:15.127 #71 DONE cov: 12485 ft: 15542 corp: 35/781b lim: 35 exec/s: 35 rss: 75Mb 00:09:15.127 ###### Recommended dictionary. ###### 00:09:15.127 "\001\037" # Uses: 6 00:09:15.127 ###### End of recommended dictionary. ###### 00:09:15.127 Done 71 runs in 2 second(s) 00:09:15.387 15:10:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_15.conf /var/tmp/suppress_nvmf_fuzz 00:09:15.387 15:10:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:15.387 15:10:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:15.387 15:10:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:09:15.387 15:10:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:09:15.387 15:10:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:09:15.387 15:10:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:09:15.387 15:10:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:09:15.387 15:10:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:09:15.387 15:10:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:15.387 15:10:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:15.387 15:10:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 16 00:09:15.387 15:10:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4416 00:09:15.387 15:10:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:09:15.387 15:10:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:09:15.387 15:10:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:15.387 15:10:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:15.387 15:10:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:15.387 15:10:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 00:09:15.387 [2024-11-20 15:10:53.871836] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
00:09:15.387 [2024-11-20 15:10:53.871909] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1478754 ] 00:09:15.646 [2024-11-20 15:10:54.092225] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:15.646 [2024-11-20 15:10:54.107423] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:15.646 [2024-11-20 15:10:54.160430] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:15.646 [2024-11-20 15:10:54.176661] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:09:15.646 INFO: Running with entropic power schedule (0xFF, 100). 00:09:15.646 INFO: Seed: 3833209777 00:09:15.646 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:09:15.646 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:09:15.646 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:09:15.646 INFO: A corpus is not provided, starting from an empty corpus 00:09:15.646 #2 INITED exec/s: 0 rss: 66Mb 00:09:15.646 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:15.646 This may also happen if the target rejected all inputs we tried so far 00:09:15.646 [2024-11-20 15:10:54.231932] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3110627431481486123 len:11052 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.646 [2024-11-20 15:10:54.231963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:15.905 NEW_FUNC[1/716]: 0x470188 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:09:15.905 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:15.905 #10 NEW cov: 12310 ft: 12300 corp: 2/36b lim: 105 exec/s: 0 rss: 73Mb L: 35/35 MS: 3 CopyPart-CopyPart-InsertRepeatedBytes- 00:09:15.905 [2024-11-20 15:10:54.572817] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3110627431481486123 len:11052 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.905 [2024-11-20 15:10:54.572852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.166 #11 NEW cov: 12425 ft: 12823 corp: 3/75b lim: 105 exec/s: 0 rss: 74Mb L: 39/39 MS: 1 CMP- DE: "\377\377\376\377"- 00:09:16.167 [2024-11-20 15:10:54.633033] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:651062659349285129 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.167 [2024-11-20 15:10:54.633063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.167 [2024-11-20 15:10:54.633126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.167 [2024-11-20 15:10:54.633144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:16.167 
#18 NEW cov: 12431 ft: 13527 corp: 4/127b lim: 105 exec/s: 0 rss: 74Mb L: 52/52 MS: 2 InsertRepeatedBytes-InsertRepeatedBytes- 00:09:16.167 [2024-11-20 15:10:54.672992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3110627431481486123 len:11052 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.167 [2024-11-20 15:10:54.673024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.167 #19 NEW cov: 12516 ft: 13819 corp: 5/163b lim: 105 exec/s: 0 rss: 74Mb L: 36/52 MS: 1 CrossOver- 00:09:16.167 [2024-11-20 15:10:54.713130] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.167 [2024-11-20 15:10:54.713159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.167 #30 NEW cov: 12516 ft: 14060 corp: 6/190b lim: 105 exec/s: 0 rss: 74Mb L: 27/52 MS: 1 InsertRepeatedBytes- 00:09:16.167 [2024-11-20 15:10:54.753375] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446461684073245695 len:11052 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.167 [2024-11-20 15:10:54.753402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.167 [2024-11-20 15:10:54.753441] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:3110627432037296939 len:11052 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.167 [2024-11-20 15:10:54.753460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:16.167 #31 NEW cov: 12516 ft: 14099 corp: 7/241b lim: 105 exec/s: 0 rss: 74Mb L: 51/52 MS: 1 CrossOver- 00:09:16.167 [2024-11-20 15:10:54.813558] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:651062659349285129 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.167 [2024-11-20 15:10:54.813586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.167 [2024-11-20 15:10:54.813635] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.167 [2024-11-20 15:10:54.813652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:16.427 #32 NEW cov: 12516 ft: 14167 corp: 8/293b lim: 105 exec/s: 0 rss: 74Mb L: 52/52 MS: 1 ChangeBit- 00:09:16.427 [2024-11-20 15:10:54.873695] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:651062659349285129 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.427 [2024-11-20 15:10:54.873722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.427 [2024-11-20 15:10:54.873777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.427 [2024-11-20 15:10:54.873794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:09:16.427 #33 NEW cov: 12516 ft: 14221 corp: 9/345b lim: 105 exec/s: 0 rss: 74Mb L: 52/52 MS: 1 CrossOver- 00:09:16.427 [2024-11-20 15:10:54.913673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742970071386667 len:11052 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.427 [2024-11-20 15:10:54.913701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.427 #35 NEW cov: 12516 ft: 14306 corp: 10/385b lim: 105 exec/s: 0 rss: 74Mb L: 40/52 MS: 2 CopyPart-CrossOver- 00:09:16.427 [2024-11-20 15:10:54.953906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:651062659349285375 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.427 [2024-11-20 15:10:54.953933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.427 [2024-11-20 15:10:54.953973] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.427 [2024-11-20 15:10:54.953989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:16.427 #36 NEW cov: 12516 ft: 14445 corp: 11/437b lim: 105 exec/s: 0 rss: 74Mb L: 52/52 MS: 1 ChangeByte- 00:09:16.427 [2024-11-20 15:10:55.014107] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:651062659349285129 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.427 [2024-11-20 15:10:55.014134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.427 [2024-11-20 15:10:55.014172] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.427 [2024-11-20 15:10:55.014189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:16.427 #37 NEW cov: 12516 ft: 14469 corp: 12/489b lim: 105 exec/s: 0 rss: 74Mb L: 52/52 MS: 1 ChangeBit- 00:09:16.427 [2024-11-20 15:10:55.054060] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3110627431481486123 len:11052 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.427 [2024-11-20 15:10:55.054091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.427 #38 NEW cov: 12516 ft: 14534 corp: 13/528b lim: 105 exec/s: 0 rss: 74Mb L: 39/52 MS: 1 ShuffleBytes- 00:09:16.688 [2024-11-20 15:10:55.114422] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:651062659349285129 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.688 [2024-11-20 15:10:55.114451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.688 [2024-11-20 15:10:55.114501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.688 [2024-11-20 15:10:55.114518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:09:16.688 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:16.688 #39 NEW cov: 12533 ft: 14586 corp: 14/580b lim: 105 exec/s: 0 rss: 74Mb L: 52/52 MS: 1 CopyPart- 00:09:16.688 [2024-11-20 15:10:55.154485] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:651062659349285129 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.688 [2024-11-20 15:10:55.154511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.688 [2024-11-20 15:10:55.154566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.688 [2024-11-20 15:10:55.154582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:16.688 #45 NEW cov: 12533 ft: 14640 corp: 15/632b lim: 105 exec/s: 0 rss: 74Mb L: 52/52 MS: 1 CrossOver- 00:09:16.688 [2024-11-20 15:10:55.214693] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:651062659349285129 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.688 [2024-11-20 15:10:55.214722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.688 [2024-11-20 15:10:55.214775] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.688 [2024-11-20 15:10:55.214792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:16.688 #46 NEW cov: 12533 ft: 14689 corp: 16/684b lim: 105 exec/s: 46 rss: 74Mb L: 52/52 MS: 1 CrossOver- 00:09:16.688 [2024-11-20 15:10:55.254750] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:651062659349285129 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.689 [2024-11-20 15:10:55.254778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.689 [2024-11-20 15:10:55.254816] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.689 [2024-11-20 15:10:55.254832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:16.689 #47 NEW cov: 12533 ft: 14722 corp: 17/733b lim: 105 exec/s: 47 rss: 74Mb L: 49/52 MS: 1 CrossOver- 00:09:16.689 [2024-11-20 15:10:55.314822] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3110627431481486123 len:65280 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.689 [2024-11-20 15:10:55.314850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.689 #48 NEW cov: 12533 ft: 14795 corp: 18/772b lim: 105 exec/s: 48 rss: 74Mb L: 39/52 MS: 1 PersAutoDict- DE: "\377\377\376\377"- 00:09:16.989 [2024-11-20 15:10:55.375326] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3110627431481486123 len:11052 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.989 [2024-11-20 
15:10:55.375360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.989 [2024-11-20 15:10:55.375416] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:3110627805699451691 len:33411 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.989 [2024-11-20 15:10:55.375431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:16.990 [2024-11-20 15:10:55.375488] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9404222468949967490 len:33411 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.990 [2024-11-20 15:10:55.375502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:16.990 #49 NEW cov: 12533 ft: 15108 corp: 19/850b lim: 105 exec/s: 49 rss: 74Mb L: 78/78 MS: 1 InsertRepeatedBytes- 00:09:16.990 [2024-11-20 15:10:55.415356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3110627431481486123 len:11052 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.990 [2024-11-20 15:10:55.415386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.990 [2024-11-20 15:10:55.415437] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:3137649403463674667 len:33411 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.990 [2024-11-20 15:10:55.415454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:16.990 [2024-11-20 15:10:55.415507] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9404222468949967490 len:33411 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.990 [2024-11-20 15:10:55.415523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:16.990 #50 NEW cov: 12533 ft: 15138 corp: 20/928b lim: 105 exec/s: 50 rss: 74Mb L: 78/78 MS: 1 ChangeByte- 00:09:16.990 [2024-11-20 15:10:55.475252] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3110627431481486123 len:11052 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.990 [2024-11-20 15:10:55.475281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.990 #51 NEW cov: 12533 ft: 15226 corp: 21/967b lim: 105 exec/s: 51 rss: 74Mb L: 39/78 MS: 1 PersAutoDict- DE: "\377\377\376\377"- 00:09:16.990 [2024-11-20 15:10:55.515402] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.990 [2024-11-20 15:10:55.515431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.990 #52 NEW cov: 12533 ft: 15272 corp: 22/994b lim: 105 exec/s: 52 rss: 74Mb L: 27/78 MS: 1 CMP- DE: "\000\000\000\000"- 00:09:16.990 [2024-11-20 15:10:55.575538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:648518346509715721 len:2315 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.990 [2024-11-20 15:10:55.575564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.990 #53 NEW cov: 12533 ft: 15319 corp: 23/1026b lim: 105 exec/s: 53 rss: 74Mb L: 32/78 MS: 1 CrossOver- 00:09:16.990 [2024-11-20 15:10:55.615804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:651062659349285129 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.990 [2024-11-20 15:10:55.615833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:16.990 [2024-11-20 15:10:55.615874] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.990 [2024-11-20 15:10:55.615894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:16.990 #54 NEW cov: 12533 ft: 15339 corp: 24/1078b lim: 105 exec/s: 54 rss: 74Mb L: 52/78 MS: 1 ShuffleBytes- 00:09:16.990 [2024-11-20 15:10:55.655817] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3110627431481486123 len:11052 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.990 [2024-11-20 15:10:55.655845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.254 #55 NEW cov: 12533 ft: 15352 corp: 25/1117b lim: 105 exec/s: 55 rss: 74Mb L: 39/78 MS: 1 PersAutoDict- DE: "\377\377\376\377"- 00:09:17.254 [2024-11-20 15:10:55.695923] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742970071386667 len:11052 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.254 [2024-11-20 15:10:55.695952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.254 #56 NEW cov: 12533 ft: 15360 corp: 26/1157b lim: 105 exec/s: 56 rss: 75Mb L: 40/78 MS: 1 ChangeBit- 00:09:17.254 [2024-11-20 15:10:55.756088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3110627431481486123 len:11052 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.255 [2024-11-20 15:10:55.756115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.255 #57 NEW cov: 12533 ft: 15397 corp: 27/1196b lim: 105 exec/s: 57 rss: 75Mb L: 39/78 MS: 1 ChangeBinInt- 00:09:17.255 [2024-11-20 15:10:55.816470] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:651062659349285375 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.255 [2024-11-20 15:10:55.816497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.255 [2024-11-20 15:10:55.816567] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.255 [2024-11-20 15:10:55.816584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:17.255 [2024-11-20 15:10:55.816637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:723401728380766730 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.255 [2024-11-20 15:10:55.816653] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:17.255 #58 NEW cov: 12533 ft: 15418 corp: 28/1276b lim: 105 exec/s: 58 rss: 75Mb L: 80/80 MS: 1 CopyPart- 00:09:17.255 [2024-11-20 15:10:55.876501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:651062659349285129 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.255 [2024-11-20 15:10:55.876529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.255 [2024-11-20 15:10:55.876583] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.255 [2024-11-20 15:10:55.876600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:17.255 #59 NEW cov: 12533 ft: 15426 corp: 29/1328b lim: 105 exec/s: 59 rss: 75Mb L: 52/80 MS: 1 ChangeBit- 00:09:17.255 [2024-11-20 15:10:55.916622] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:651062659349285129 len:6169 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.255 [2024-11-20 15:10:55.916650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.255 [2024-11-20 15:10:55.916689] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.255 [2024-11-20 15:10:55.916708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:17.532 #60 NEW cov: 12533 ft: 15440 corp: 30/1387b lim: 105 exec/s: 60 rss: 75Mb L: 59/80 MS: 1 InsertRepeatedBytes- 00:09:17.532 [2024-11-20 15:10:55.976793] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:651062659349285129 len:6923 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.532 [2024-11-20 15:10:55.976820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.532 [2024-11-20 15:10:55.976875] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.532 [2024-11-20 15:10:55.976892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:17.532 #61 NEW cov: 12533 ft: 15460 corp: 31/1440b lim: 105 exec/s: 61 rss: 75Mb L: 53/80 MS: 1 InsertByte- 00:09:17.532 [2024-11-20 15:10:56.016837] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.532 [2024-11-20 15:10:56.016864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.532 #62 NEW cov: 12533 ft: 15467 corp: 32/1467b lim: 105 exec/s: 62 rss: 75Mb L: 27/80 MS: 1 ChangeByte- 00:09:17.532 [2024-11-20 15:10:56.057135] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.532 [2024-11-20 15:10:56.057163] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.532 [2024-11-20 15:10:56.057212] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.532 [2024-11-20 15:10:56.057228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:17.532 [2024-11-20 15:10:56.057285] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.532 [2024-11-20 15:10:56.057300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:17.532 #63 NEW cov: 12533 ft: 15493 corp: 33/1544b lim: 105 exec/s: 63 rss: 75Mb L: 77/80 MS: 1 InsertRepeatedBytes- 00:09:17.532 [2024-11-20 15:10:56.117275] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3110627431481486123 len:11052 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.532 [2024-11-20 15:10:56.117303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.532 [2024-11-20 15:10:56.117377] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:3137649403463674667 len:33411 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.532 [2024-11-20 15:10:56.117394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:17.532 [2024-11-20 15:10:56.117452] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9404222468949967490 len:33411 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.532 [2024-11-20 15:10:56.117469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:17.532 #64 NEW cov: 12540 ft: 15514 corp: 34/1623b lim: 105 exec/s: 64 rss: 75Mb L: 79/80 MS: 1 InsertByte- 00:09:17.532 [2024-11-20 15:10:56.177213] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3110627431481486123 len:11052 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.532 [2024-11-20 15:10:56.177241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.532 #65 NEW cov: 12540 ft: 15528 corp: 35/1659b lim: 105 exec/s: 65 rss: 75Mb L: 36/80 MS: 1 ChangeByte- 00:09:17.824 [2024-11-20 15:10:56.217495] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3110627431481486123 len:11052 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.824 [2024-11-20 15:10:56.217522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.824 [2024-11-20 15:10:56.217563] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18374452469102804991 len:11052 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.824 [2024-11-20 15:10:56.217578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:17.824 #66 NEW cov: 12540 ft: 15559 corp: 36/1702b lim: 105 exec/s: 33 rss: 75Mb L: 43/80 MS: 1 PersAutoDict- DE: 
"\377\377\376\377"- 00:09:17.824 #66 DONE cov: 12540 ft: 15559 corp: 36/1702b lim: 105 exec/s: 33 rss: 75Mb 00:09:17.824 ###### Recommended dictionary. ###### 00:09:17.824 "\377\377\376\377" # Uses: 6 00:09:17.824 "\000\000\000\000" # Uses: 0 00:09:17.824 ###### End of recommended dictionary. ###### 00:09:17.824 Done 66 runs in 2 second(s) 00:09:17.824 15:10:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_16.conf /var/tmp/suppress_nvmf_fuzz 00:09:17.824 15:10:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:17.824 15:10:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:17.824 15:10:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:09:17.824 15:10:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:09:17.824 15:10:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:09:17.824 15:10:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:09:17.824 15:10:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:09:17.824 15:10:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:09:17.824 15:10:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:17.825 15:10:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:17.825 15:10:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 17 00:09:17.825 15:10:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4417 00:09:17.825 15:10:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:09:17.825 15:10:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:09:17.825 15:10:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:17.825 15:10:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:17.825 15:10:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:17.825 15:10:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 00:09:17.825 [2024-11-20 15:10:56.375626] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
00:09:17.825 [2024-11-20 15:10:56.375698] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1479109 ] 00:09:18.127 [2024-11-20 15:10:56.591182] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:18.127 [2024-11-20 15:10:56.606142] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:18.127 [2024-11-20 15:10:56.658972] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:18.127 [2024-11-20 15:10:56.675208] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:09:18.127 INFO: Running with entropic power schedule (0xFF, 100). 00:09:18.127 INFO: Seed: 2036243234 00:09:18.127 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:09:18.127 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:09:18.127 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:09:18.127 INFO: A corpus is not provided, starting from an empty corpus 00:09:18.127 #2 INITED exec/s: 0 rss: 66Mb 00:09:18.127 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:18.127 This may also happen if the target rejected all inputs we tried so far 00:09:18.127 [2024-11-20 15:10:56.753557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:171324928 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.127 [2024-11-20 15:10:56.753606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.127 [2024-11-20 15:10:56.753719] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.127 [2024-11-20 15:10:56.753737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:18.127 [2024-11-20 15:10:56.753835] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.127 [2024-11-20 15:10:56.753858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:18.127 [2024-11-20 15:10:56.753976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.127 [2024-11-20 15:10:56.754000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:18.450 NEW_FUNC[1/717]: 0x473508 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:09:18.450 NEW_FUNC[2/717]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:18.450 #6 NEW cov: 12315 ft: 12316 corp: 2/99b lim: 120 exec/s: 0 rss: 73Mb L: 98/98 MS: 4 ShuffleBytes-InsertRepeatedBytes-InsertByte-InsertRepeatedBytes- 00:09:18.450 [2024-11-20 15:10:57.093781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:171324928 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.450 
[2024-11-20 15:10:57.093823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.450 [2024-11-20 15:10:57.093914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.450 [2024-11-20 15:10:57.093933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:18.450 [2024-11-20 15:10:57.094023] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.450 [2024-11-20 15:10:57.094039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:18.450 [2024-11-20 15:10:57.094137] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.450 [2024-11-20 15:10:57.094156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:18.723 #7 NEW cov: 12445 ft: 12959 corp: 3/197b lim: 120 exec/s: 0 rss: 74Mb L: 98/98 MS: 1 ChangeBit- 00:09:18.723 [2024-11-20 15:10:57.164129] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:171324928 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.723 [2024-11-20 15:10:57.164160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.723 [2024-11-20 15:10:57.164231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.723 [2024-11-20 15:10:57.164252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:18.723 [2024-11-20 15:10:57.164331] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.723 [2024-11-20 15:10:57.164361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:18.723 [2024-11-20 15:10:57.164455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.723 [2024-11-20 15:10:57.164473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:18.723 #8 NEW cov: 12451 ft: 13204 corp: 4/295b lim: 120 exec/s: 0 rss: 74Mb L: 98/98 MS: 1 ShuffleBytes- 00:09:18.723 [2024-11-20 15:10:57.234302] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.723 [2024-11-20 15:10:57.234336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.723 [2024-11-20 15:10:57.234406] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.723 [2024-11-20 15:10:57.234426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:09:18.723 [2024-11-20 15:10:57.234498] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.723 [2024-11-20 15:10:57.234517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:18.723 [2024-11-20 15:10:57.234605] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.724 [2024-11-20 15:10:57.234625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:18.724 #9 NEW cov: 12536 ft: 13468 corp: 5/413b lim: 120 exec/s: 0 rss: 74Mb L: 118/118 MS: 1 CopyPart- 00:09:18.724 [2024-11-20 15:10:57.284170] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446463702539436031 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.724 [2024-11-20 15:10:57.284201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.724 [2024-11-20 15:10:57.284281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:262144 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.724 [2024-11-20 15:10:57.284302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:18.724 [2024-11-20 15:10:57.284372] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.724 [2024-11-20 15:10:57.284392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:18.724 #14 NEW cov: 12536 ft: 13875 corp: 6/486b lim: 120 exec/s: 0 rss: 74Mb L: 73/118 MS: 5 ChangeByte-InsertRepeatedBytes-InsertByte-CMP-CrossOver- DE: "\001\000\000\000\000\000\000\000"- 00:09:18.724 [2024-11-20 15:10:57.333754] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3110627432037296939 len:11052 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.724 [2024-11-20 15:10:57.333789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.724 #16 NEW cov: 12536 ft: 14720 corp: 7/519b lim: 120 exec/s: 0 rss: 74Mb L: 33/118 MS: 2 ChangeBit-InsertRepeatedBytes- 00:09:18.724 [2024-11-20 15:10:57.384559] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073440067583 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.724 [2024-11-20 15:10:57.384591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.724 [2024-11-20 15:10:57.384665] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.724 [2024-11-20 15:10:57.384683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:18.724 [2024-11-20 15:10:57.384748] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:18.724 
[2024-11-20 15:10:57.384765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:19.003 #19 NEW cov: 12536 ft: 14863 corp: 8/603b lim: 120 exec/s: 0 rss: 74Mb L: 84/118 MS: 3 ChangeByte-CopyPart-InsertRepeatedBytes- 00:09:19.003 [2024-11-20 15:10:57.434963] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446463702539436031 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.003 [2024-11-20 15:10:57.434994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.003 [2024-11-20 15:10:57.435068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:262144 len:240 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.003 [2024-11-20 15:10:57.435088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:19.003 [2024-11-20 15:10:57.435156] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.003 [2024-11-20 15:10:57.435174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:19.003 #20 NEW cov: 12536 ft: 14882 corp: 9/676b lim: 120 exec/s: 0 rss: 74Mb L: 73/118 MS: 1 CMP- DE: "\357\245\346`\242$E\000"- 00:09:19.003 [2024-11-20 15:10:57.505553] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:171324928 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.003 [2024-11-20 15:10:57.505583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.003 [2024-11-20 15:10:57.505670] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.003 [2024-11-20 15:10:57.505689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:19.003 [2024-11-20 15:10:57.505754] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.003 [2024-11-20 15:10:57.505773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:19.003 [2024-11-20 15:10:57.505875] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.003 [2024-11-20 15:10:57.505892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:19.003 #21 NEW cov: 12536 ft: 14987 corp: 10/774b lim: 120 exec/s: 0 rss: 74Mb L: 98/118 MS: 1 ChangeBit- 00:09:19.003 [2024-11-20 15:10:57.555727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.003 [2024-11-20 15:10:57.555760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.003 [2024-11-20 15:10:57.555839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:09:19.003 [2024-11-20 15:10:57.555857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:19.003 [2024-11-20 15:10:57.555922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.003 [2024-11-20 15:10:57.555939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:19.003 [2024-11-20 15:10:57.556031] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.003 [2024-11-20 15:10:57.556050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:19.003 #22 NEW cov: 12536 ft: 15024 corp: 11/892b lim: 120 exec/s: 0 rss: 74Mb L: 118/118 MS: 1 ChangeASCIIInt- 00:09:19.003 [2024-11-20 15:10:57.625964] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:10977524259487744 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.003 [2024-11-20 15:10:57.625992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.003 [2024-11-20 15:10:57.626068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.003 [2024-11-20 15:10:57.626090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:19.003 [2024-11-20 15:10:57.626159] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.003 [2024-11-20 15:10:57.626181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:19.003 [2024-11-20 15:10:57.626271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.003 [2024-11-20 15:10:57.626287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:19.003 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:19.003 #23 NEW cov: 12559 ft: 15050 corp: 12/1011b lim: 120 exec/s: 0 rss: 74Mb L: 119/119 MS: 1 InsertByte- 00:09:19.312 [2024-11-20 15:10:57.696348] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:171324928 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.312 [2024-11-20 15:10:57.696380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.312 [2024-11-20 15:10:57.696462] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.312 [2024-11-20 15:10:57.696482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:19.312 [2024-11-20 15:10:57.696565] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:09:19.312 [2024-11-20 15:10:57.696582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:19.312 [2024-11-20 15:10:57.696671] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.312 [2024-11-20 15:10:57.696691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:19.312 #24 NEW cov: 12559 ft: 15060 corp: 13/1117b lim: 120 exec/s: 0 rss: 74Mb L: 106/119 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:09:19.312 [2024-11-20 15:10:57.745679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3110627432037296939 len:11052 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.312 [2024-11-20 15:10:57.745707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.312 #25 NEW cov: 12559 ft: 15089 corp: 14/1150b lim: 120 exec/s: 25 rss: 74Mb L: 33/119 MS: 1 ChangeBinInt- 00:09:19.312 [2024-11-20 15:10:57.816586] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073440067583 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.312 [2024-11-20 15:10:57.816617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.312 [2024-11-20 15:10:57.816694] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.312 [2024-11-20 15:10:57.816714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:19.312 [2024-11-20 15:10:57.816767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:12800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.312 [2024-11-20 15:10:57.816790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:19.312 #26 NEW cov: 12559 ft: 15099 corp: 15/1234b lim: 120 exec/s: 26 rss: 74Mb L: 84/119 MS: 1 ChangeByte- 00:09:19.312 [2024-11-20 15:10:57.886430] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3110627432037296939 len:11052 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.312 [2024-11-20 15:10:57.886473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.312 #27 NEW cov: 12559 ft: 15133 corp: 16/1268b lim: 120 exec/s: 27 rss: 74Mb L: 34/119 MS: 1 InsertByte- 00:09:19.312 [2024-11-20 15:10:57.957507] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446463702539436031 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.312 [2024-11-20 15:10:57.957539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.312 [2024-11-20 15:10:57.957618] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1024 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.312 [2024-11-20 15:10:57.957639] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:19.312 [2024-11-20 15:10:57.957701] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.312 [2024-11-20 15:10:57.957720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:19.312 #28 NEW cov: 12559 ft: 15149 corp: 17/1342b lim: 120 exec/s: 28 rss: 74Mb L: 74/119 MS: 1 InsertByte- 00:09:19.581 [2024-11-20 15:10:58.008399] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:171324928 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.581 [2024-11-20 15:10:58.008432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.581 [2024-11-20 15:10:58.008505] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:32768 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.581 [2024-11-20 15:10:58.008523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:19.581 [2024-11-20 15:10:58.008577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.581 [2024-11-20 15:10:58.008597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:19.581 [2024-11-20 15:10:58.008687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.581 [2024-11-20 15:10:58.008706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:19.581 #29 NEW cov: 12559 ft: 15164 corp: 18/1440b lim: 120 exec/s: 29 rss: 74Mb L: 98/119 MS: 1 ChangeBit- 00:09:19.581 [2024-11-20 15:10:58.077861] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073438363647 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.581 [2024-11-20 15:10:58.077894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.581 #33 NEW cov: 12559 ft: 15201 corp: 19/1478b lim: 120 exec/s: 33 rss: 75Mb L: 38/119 MS: 4 CrossOver-CrossOver-ChangeByte-CrossOver- 00:09:19.581 [2024-11-20 15:10:58.158890] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446463702539436031 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.581 [2024-11-20 15:10:58.158922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.581 [2024-11-20 15:10:58.159014] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:262144 len:240 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.581 [2024-11-20 15:10:58.159034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:19.581 [2024-11-20 15:10:58.159131] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.581 [2024-11-20 15:10:58.159150] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:19.581 #39 NEW cov: 12559 ft: 15241 corp: 20/1551b lim: 120 exec/s: 39 rss: 75Mb L: 73/119 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:09:19.581 [2024-11-20 15:10:58.228818] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:168427776 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.581 [2024-11-20 15:10:58.228849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.581 [2024-11-20 15:10:58.228954] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1026497183744 len:24739 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.581 [2024-11-20 15:10:58.228971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:19.581 #41 NEW cov: 12559 ft: 15567 corp: 21/1600b lim: 120 exec/s: 41 rss: 75Mb L: 49/119 MS: 2 CopyPart-CrossOver- 00:09:19.841 [2024-11-20 15:10:58.279441] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073440067583 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.841 [2024-11-20 15:10:58.279473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.841 [2024-11-20 15:10:58.279554] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.841 [2024-11-20 15:10:58.279572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:19.841 [2024-11-20 15:10:58.279635] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073695133695 len:65330 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.841 [2024-11-20 15:10:58.279652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:19.841 #42 NEW cov: 12559 ft: 15577 corp: 22/1685b lim: 120 exec/s: 42 rss: 75Mb L: 85/119 MS: 1 InsertByte- 00:09:19.841 [2024-11-20 15:10:58.329746] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073440067583 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.841 [2024-11-20 15:10:58.329777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.841 [2024-11-20 15:10:58.329828] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.841 [2024-11-20 15:10:58.329850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:19.841 [2024-11-20 15:10:58.329914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073695133695 len:65330 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.841 [2024-11-20 15:10:58.329930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:19.841 #43 NEW cov: 12559 ft: 15589 corp: 
23/1766b lim: 120 exec/s: 43 rss: 75Mb L: 81/119 MS: 1 EraseBytes- 00:09:19.841 [2024-11-20 15:10:58.400394] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073440067583 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.841 [2024-11-20 15:10:58.400423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.841 [2024-11-20 15:10:58.400496] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.841 [2024-11-20 15:10:58.400515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:19.841 [2024-11-20 15:10:58.400594] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.841 [2024-11-20 15:10:58.400611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:19.841 [2024-11-20 15:10:58.400703] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551395 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.841 [2024-11-20 15:10:58.400719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:19.841 #44 NEW cov: 12559 ft: 15621 corp: 24/1873b lim: 120 exec/s: 44 rss: 75Mb L: 107/119 MS: 1 InsertRepeatedBytes- 00:09:19.841 [2024-11-20 15:10:58.470599] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:171324928 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.841 [2024-11-20 15:10:58.470627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.841 [2024-11-20 15:10:58.470718] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.841 [2024-11-20 15:10:58.470736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:19.841 [2024-11-20 15:10:58.470803] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.841 [2024-11-20 15:10:58.470821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:19.841 [2024-11-20 15:10:58.470912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.841 [2024-11-20 15:10:58.470935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:19.841 #45 NEW cov: 12559 ft: 15640 corp: 25/1971b lim: 120 exec/s: 45 rss: 75Mb L: 98/119 MS: 1 ShuffleBytes- 00:09:19.841 [2024-11-20 15:10:58.521029] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:171324928 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.841 [2024-11-20 15:10:58.521060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:09:19.841 [2024-11-20 15:10:58.521141] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.841 [2024-11-20 15:10:58.521162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:19.841 [2024-11-20 15:10:58.521244] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.841 [2024-11-20 15:10:58.521266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:19.841 [2024-11-20 15:10:58.521351] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:13825 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.841 [2024-11-20 15:10:58.521369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:20.100 #46 NEW cov: 12559 ft: 15661 corp: 26/2077b lim: 120 exec/s: 46 rss: 75Mb L: 106/119 MS: 1 CopyPart- 00:09:20.100 [2024-11-20 15:10:58.591393] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:171324928 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.100 [2024-11-20 15:10:58.591425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.100 [2024-11-20 15:10:58.591503] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.100 [2024-11-20 15:10:58.591522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.100 [2024-11-20 15:10:58.591600] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.100 [2024-11-20 15:10:58.591615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:20.100 [2024-11-20 15:10:58.591708] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:256 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.100 [2024-11-20 15:10:58.591725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:20.100 #47 NEW cov: 12559 ft: 15733 corp: 27/2191b lim: 120 exec/s: 47 rss: 75Mb L: 114/119 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:09:20.100 [2024-11-20 15:10:58.641676] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.100 [2024-11-20 15:10:58.641705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.100 [2024-11-20 15:10:58.641778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.100 [2024-11-20 15:10:58.641797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.100 [2024-11-20 15:10:58.641873] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.100 [2024-11-20 15:10:58.641894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:20.100 [2024-11-20 15:10:58.641985] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.100 [2024-11-20 15:10:58.642008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:20.100 #48 NEW cov: 12559 ft: 15769 corp: 28/2309b lim: 120 exec/s: 48 rss: 75Mb L: 118/119 MS: 1 CopyPart- 00:09:20.100 [2024-11-20 15:10:58.691596] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073440067583 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.100 [2024-11-20 15:10:58.691625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.100 [2024-11-20 15:10:58.691697] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.100 [2024-11-20 15:10:58.691717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.100 [2024-11-20 15:10:58.691790] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.100 [2024-11-20 15:10:58.691809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:20.100 #49 NEW cov: 12559 ft: 15799 corp: 29/2393b lim: 120 exec/s: 49 rss: 75Mb L: 84/119 MS: 1 ShuffleBytes- 00:09:20.100 [2024-11-20 15:10:58.742039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073440067583 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.100 [2024-11-20 15:10:58.742068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.100 [2024-11-20 15:10:58.742139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.100 [2024-11-20 15:10:58.742161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.100 [2024-11-20 15:10:58.742252] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073695133695 len:65330 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.100 [2024-11-20 15:10:58.742272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:20.100 #50 NEW cov: 12559 ft: 15817 corp: 30/2474b lim: 120 exec/s: 25 rss: 75Mb L: 81/119 MS: 1 ShuffleBytes- 00:09:20.100 #50 DONE cov: 12559 ft: 15817 corp: 30/2474b lim: 120 exec/s: 25 rss: 75Mb 00:09:20.100 ###### Recommended dictionary. ###### 00:09:20.100 "\001\000\000\000\000\000\000\000" # Uses: 3 00:09:20.100 "\357\245\346`\242$E\000" # Uses: 0 00:09:20.100 ###### End of recommended dictionary. 
###### 00:09:20.100 Done 50 runs in 2 second(s) 00:09:20.359 15:10:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_17.conf /var/tmp/suppress_nvmf_fuzz 00:09:20.359 15:10:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:20.359 15:10:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:20.359 15:10:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:09:20.359 15:10:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:09:20.359 15:10:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:09:20.359 15:10:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:09:20.359 15:10:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:09:20.359 15:10:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:09:20.359 15:10:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:20.359 15:10:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:20.359 15:10:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 18 00:09:20.359 15:10:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4418 00:09:20.359 15:10:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:09:20.360 15:10:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:09:20.360 15:10:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:20.360 15:10:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:20.360 15:10:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:20.360 15:10:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 00:09:20.360 [2024-11-20 15:10:58.906564] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
00:09:20.360 [2024-11-20 15:10:58.906636] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1479482 ] 00:09:20.620 [2024-11-20 15:10:59.123477] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:20.620 [2024-11-20 15:10:59.138049] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:20.620 [2024-11-20 15:10:59.190844] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:20.620 [2024-11-20 15:10:59.207081] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:09:20.620 INFO: Running with entropic power schedule (0xFF, 100). 00:09:20.620 INFO: Seed: 272260439 00:09:20.620 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:09:20.620 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:09:20.620 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:09:20.620 INFO: A corpus is not provided, starting from an empty corpus 00:09:20.620 #2 INITED exec/s: 0 rss: 66Mb 00:09:20.620 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:20.620 This may also happen if the target rejected all inputs we tried so far 00:09:20.620 [2024-11-20 15:10:59.262592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:20.620 [2024-11-20 15:10:59.262622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.620 [2024-11-20 15:10:59.262673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:20.621 [2024-11-20 15:10:59.262689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.621 [2024-11-20 15:10:59.262741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:20.621 [2024-11-20 15:10:59.262757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:21.138 NEW_FUNC[1/715]: 0x476df8 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:09:21.138 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:21.139 #9 NEW cov: 12276 ft: 12273 corp: 2/64b lim: 100 exec/s: 0 rss: 73Mb L: 63/63 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:09:21.139 [2024-11-20 15:10:59.603498] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:21.139 [2024-11-20 15:10:59.603536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.139 [2024-11-20 15:10:59.603610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:21.139 [2024-11-20 15:10:59.603625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.139 [2024-11-20 15:10:59.603678] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:21.139 [2024-11-20 15:10:59.603693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:21.139 #20 NEW cov: 12389 ft: 12876 corp: 3/127b lim: 100 exec/s: 0 rss: 74Mb L: 63/63 MS: 1 ChangeBit- 00:09:21.139 [2024-11-20 15:10:59.663562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:21.139 [2024-11-20 15:10:59.663589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.139 [2024-11-20 15:10:59.663635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:21.139 [2024-11-20 15:10:59.663651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.139 [2024-11-20 15:10:59.663703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:21.139 [2024-11-20 15:10:59.663717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:21.139 #21 NEW cov: 12395 ft: 13250 corp: 4/190b lim: 100 exec/s: 0 rss: 74Mb L: 63/63 MS: 1 CrossOver- 00:09:21.139 [2024-11-20 15:10:59.703396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:21.139 [2024-11-20 15:10:59.703422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.139 #22 NEW cov: 12480 ft: 13968 corp: 5/226b lim: 100 exec/s: 0 rss: 74Mb L: 36/63 MS: 1 EraseBytes- 00:09:21.139 [2024-11-20 15:10:59.763753] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:21.139 [2024-11-20 15:10:59.763780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.139 [2024-11-20 15:10:59.763827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:21.139 [2024-11-20 15:10:59.763842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.139 [2024-11-20 15:10:59.763896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:21.139 [2024-11-20 15:10:59.763909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:21.139 #23 NEW cov: 12480 ft: 14032 corp: 6/293b lim: 100 exec/s: 0 rss: 74Mb L: 67/67 MS: 1 CMP- DE: "\000\000\000\000"- 00:09:21.139 [2024-11-20 15:10:59.823846] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:21.139 [2024-11-20 15:10:59.823874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.397 [2024-11-20 15:10:59.823918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:21.397 [2024-11-20 15:10:59.823934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
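To read the libFuzzer progress lines interleaved with the NVMe notices: in a line such as "#28 NEW cov: 12480 ft: 14360 corp: 7/346b lim: 100 exec/s: 0 rss: 74Mb L: 53/67 MS: 5 CopyPart-...", input number 28 added new coverage. cov counts covered code edges, ft distinct coverage features, corp the corpus size in inputs and bytes, lim the current cap on generated input length, L this input's length against the largest input in the corpus, and MS the mutation sequence that produced it, with a trailing DE: naming any dictionary entry that was spliced in. A quick way to extract the coverage trajectory from a captured run; the log file name here is an assumption:

  # Emit "input-number edge-count" pairs from a saved run log.
  grep -oE '#[0-9]+ NEW cov: [0-9]+' llvm_nvmf_18.log \
    | awk '{sub(/^#/, "", $1); print $1, $4}'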
00:09:21.397 #28 NEW cov: 12480 ft: 14360 corp: 7/346b lim: 100 exec/s: 0 rss: 74Mb L: 53/67 MS: 5 CopyPart-ChangeBit-ChangeBinInt-CopyPart-InsertRepeatedBytes- 00:09:21.397 [2024-11-20 15:10:59.863959] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:21.397 [2024-11-20 15:10:59.863986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.397 [2024-11-20 15:10:59.864039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:21.397 [2024-11-20 15:10:59.864055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.397 #34 NEW cov: 12480 ft: 14415 corp: 8/399b lim: 100 exec/s: 0 rss: 74Mb L: 53/67 MS: 1 CopyPart- 00:09:21.397 [2024-11-20 15:10:59.924099] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:21.397 [2024-11-20 15:10:59.924126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.397 [2024-11-20 15:10:59.924178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:21.397 [2024-11-20 15:10:59.924192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.397 #35 NEW cov: 12480 ft: 14483 corp: 9/454b lim: 100 exec/s: 0 rss: 74Mb L: 55/67 MS: 1 CMP- DE: "\034\000"- 00:09:21.397 [2024-11-20 15:10:59.964321] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:21.397 [2024-11-20 15:10:59.964348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.397 [2024-11-20 15:10:59.964413] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:21.397 [2024-11-20 15:10:59.964428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.397 [2024-11-20 15:10:59.964481] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:21.397 [2024-11-20 15:10:59.964495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:21.397 #36 NEW cov: 12480 ft: 14563 corp: 10/517b lim: 100 exec/s: 0 rss: 74Mb L: 63/67 MS: 1 ChangeByte- 00:09:21.397 [2024-11-20 15:11:00.004440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:21.397 [2024-11-20 15:11:00.004469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.397 [2024-11-20 15:11:00.004519] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:21.397 [2024-11-20 15:11:00.004532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.397 #37 NEW cov: 12480 ft: 14647 corp: 11/570b lim: 100 exec/s: 0 rss: 74Mb L: 53/67 MS: 1 ShuffleBytes- 00:09:21.397 [2024-11-20 15:11:00.064470] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES 
(08) sqid:1 cid:0 nsid:0 00:09:21.397 [2024-11-20 15:11:00.064505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.656 #38 NEW cov: 12480 ft: 14721 corp: 12/606b lim: 100 exec/s: 0 rss: 74Mb L: 36/67 MS: 1 ChangeBit- 00:09:21.656 [2024-11-20 15:11:00.124747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:21.656 [2024-11-20 15:11:00.124782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.656 [2024-11-20 15:11:00.124840] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:21.656 [2024-11-20 15:11:00.124856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.656 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:21.656 #44 NEW cov: 12503 ft: 14760 corp: 13/659b lim: 100 exec/s: 0 rss: 74Mb L: 53/67 MS: 1 ChangeBinInt- 00:09:21.656 [2024-11-20 15:11:00.184885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:21.656 [2024-11-20 15:11:00.184914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.656 [2024-11-20 15:11:00.184972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:21.656 [2024-11-20 15:11:00.184988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.656 #45 NEW cov: 12503 ft: 14796 corp: 14/712b lim: 100 exec/s: 0 rss: 74Mb L: 53/67 MS: 1 ChangeBit- 00:09:21.656 [2024-11-20 15:11:00.224983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:21.656 [2024-11-20 15:11:00.225009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.656 [2024-11-20 15:11:00.225047] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:21.656 [2024-11-20 15:11:00.225062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.656 #46 NEW cov: 12503 ft: 14863 corp: 15/767b lim: 100 exec/s: 46 rss: 74Mb L: 55/67 MS: 1 PersAutoDict- DE: "\034\000"- 00:09:21.656 [2024-11-20 15:11:00.285256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:21.656 [2024-11-20 15:11:00.285283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.656 [2024-11-20 15:11:00.285333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:21.656 [2024-11-20 15:11:00.285349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.656 [2024-11-20 15:11:00.285402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:21.656 [2024-11-20 15:11:00.285416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:21.656 #47 NEW cov: 12503 ft: 14872 corp: 16/836b lim: 100 exec/s: 47 rss: 74Mb L: 69/69 MS: 1 PersAutoDict- DE: "\034\000"- 00:09:21.914 [2024-11-20 15:11:00.345309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:21.914 [2024-11-20 15:11:00.345340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.914 [2024-11-20 15:11:00.345387] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:21.914 [2024-11-20 15:11:00.345408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.914 #48 NEW cov: 12503 ft: 14939 corp: 17/892b lim: 100 exec/s: 48 rss: 74Mb L: 56/69 MS: 1 InsertByte- 00:09:21.914 [2024-11-20 15:11:00.405489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:21.914 [2024-11-20 15:11:00.405515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.914 [2024-11-20 15:11:00.405551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:21.914 [2024-11-20 15:11:00.405575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.914 #49 NEW cov: 12503 ft: 14947 corp: 18/945b lim: 100 exec/s: 49 rss: 74Mb L: 53/69 MS: 1 ChangeBit- 00:09:21.914 [2024-11-20 15:11:00.445596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:21.914 [2024-11-20 15:11:00.445621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.914 [2024-11-20 15:11:00.445663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:21.914 [2024-11-20 15:11:00.445682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.914 #50 NEW cov: 12503 ft: 14954 corp: 19/1001b lim: 100 exec/s: 50 rss: 74Mb L: 56/69 MS: 1 CopyPart- 00:09:21.914 [2024-11-20 15:11:00.485854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:21.914 [2024-11-20 15:11:00.485881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.914 [2024-11-20 15:11:00.485929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:21.914 [2024-11-20 15:11:00.485945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.914 [2024-11-20 15:11:00.485994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:21.914 [2024-11-20 15:11:00.486008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:21.914 #51 NEW cov: 12503 ft: 14992 corp: 20/1064b lim: 100 exec/s: 51 rss: 74Mb L: 63/69 MS: 1 ChangeBit- 00:09:21.914 [2024-11-20 15:11:00.525719] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:21.914 [2024-11-20 15:11:00.525747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.914 #52 NEW cov: 12503 ft: 15052 corp: 21/1100b lim: 100 exec/s: 52 rss: 74Mb L: 36/69 MS: 1 CopyPart- 00:09:21.914 [2024-11-20 15:11:00.566053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:21.914 [2024-11-20 15:11:00.566080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.914 [2024-11-20 15:11:00.566144] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:21.914 [2024-11-20 15:11:00.566159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.914 [2024-11-20 15:11:00.566211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:21.914 [2024-11-20 15:11:00.566225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:21.914 #53 NEW cov: 12503 ft: 15077 corp: 22/1167b lim: 100 exec/s: 53 rss: 74Mb L: 67/69 MS: 1 ChangeBit- 00:09:22.172 [2024-11-20 15:11:00.606191] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.172 [2024-11-20 15:11:00.606217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.172 [2024-11-20 15:11:00.606251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:22.172 [2024-11-20 15:11:00.606265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:22.172 [2024-11-20 15:11:00.606324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:22.172 [2024-11-20 15:11:00.606339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:22.172 #54 NEW cov: 12503 ft: 15087 corp: 23/1230b lim: 100 exec/s: 54 rss: 74Mb L: 63/69 MS: 1 ChangeByte- 00:09:22.172 [2024-11-20 15:11:00.646407] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.172 [2024-11-20 15:11:00.646435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.172 [2024-11-20 15:11:00.646484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:22.172 [2024-11-20 15:11:00.646499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:22.172 [2024-11-20 15:11:00.646553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:22.172 [2024-11-20 15:11:00.646569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:22.172 [2024-11-20 15:11:00.646625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 
nsid:0 00:09:22.172 [2024-11-20 15:11:00.646640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:22.172 #55 NEW cov: 12503 ft: 15362 corp: 24/1321b lim: 100 exec/s: 55 rss: 75Mb L: 91/91 MS: 1 InsertRepeatedBytes- 00:09:22.172 [2024-11-20 15:11:00.706586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.172 [2024-11-20 15:11:00.706613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.172 [2024-11-20 15:11:00.706658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:22.172 [2024-11-20 15:11:00.706673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:22.172 [2024-11-20 15:11:00.706725] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:22.172 [2024-11-20 15:11:00.706739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:22.172 [2024-11-20 15:11:00.706795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:09:22.172 [2024-11-20 15:11:00.706810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:22.172 #56 NEW cov: 12503 ft: 15391 corp: 25/1415b lim: 100 exec/s: 56 rss: 75Mb L: 94/94 MS: 1 InsertRepeatedBytes- 00:09:22.172 [2024-11-20 15:11:00.766471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.172 [2024-11-20 15:11:00.766497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.172 [2024-11-20 15:11:00.766550] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:22.172 [2024-11-20 15:11:00.766565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:22.173 #57 NEW cov: 12503 ft: 15407 corp: 26/1471b lim: 100 exec/s: 57 rss: 75Mb L: 56/94 MS: 1 PersAutoDict- DE: "\034\000"- 00:09:22.173 [2024-11-20 15:11:00.826628] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.173 [2024-11-20 15:11:00.826654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.173 [2024-11-20 15:11:00.826692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:22.173 [2024-11-20 15:11:00.826712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:22.430 #58 NEW cov: 12503 ft: 15429 corp: 27/1511b lim: 100 exec/s: 58 rss: 75Mb L: 40/94 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:09:22.430 [2024-11-20 15:11:00.886693] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.430 [2024-11-20 15:11:00.886722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.430 #59 NEW cov: 
12503 ft: 15448 corp: 28/1536b lim: 100 exec/s: 59 rss: 75Mb L: 25/94 MS: 1 EraseBytes- 00:09:22.430 [2024-11-20 15:11:00.947186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.430 [2024-11-20 15:11:00.947214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.430 [2024-11-20 15:11:00.947267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:22.430 [2024-11-20 15:11:00.947282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:22.430 [2024-11-20 15:11:00.947349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:22.430 [2024-11-20 15:11:00.947364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:22.430 [2024-11-20 15:11:00.947416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:09:22.430 [2024-11-20 15:11:00.947429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:22.430 #60 NEW cov: 12503 ft: 15487 corp: 29/1627b lim: 100 exec/s: 60 rss: 75Mb L: 91/94 MS: 1 CopyPart- 00:09:22.430 [2024-11-20 15:11:01.007154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.430 [2024-11-20 15:11:01.007180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.431 [2024-11-20 15:11:01.007217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:22.431 [2024-11-20 15:11:01.007238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:22.431 #61 NEW cov: 12503 ft: 15496 corp: 30/1680b lim: 100 exec/s: 61 rss: 75Mb L: 53/94 MS: 1 CopyPart- 00:09:22.431 [2024-11-20 15:11:01.047478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.431 [2024-11-20 15:11:01.047505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.431 [2024-11-20 15:11:01.047560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:22.431 [2024-11-20 15:11:01.047574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:22.431 [2024-11-20 15:11:01.047628] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:22.431 [2024-11-20 15:11:01.047643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:22.431 [2024-11-20 15:11:01.047699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:09:22.431 [2024-11-20 15:11:01.047717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:22.431 #62 NEW cov: 12503 ft: 15509 corp: 31/1775b lim: 100 exec/s: 62 rss: 75Mb L: 95/95 MS: 1 
InsertRepeatedBytes- 00:09:22.431 [2024-11-20 15:11:01.087373] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.431 [2024-11-20 15:11:01.087399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.431 [2024-11-20 15:11:01.087459] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:22.431 [2024-11-20 15:11:01.087486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:22.431 #63 NEW cov: 12503 ft: 15561 corp: 32/1830b lim: 100 exec/s: 63 rss: 75Mb L: 55/95 MS: 1 PersAutoDict- DE: "\034\000"- 00:09:22.688 [2024-11-20 15:11:01.127589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.689 [2024-11-20 15:11:01.127616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.689 [2024-11-20 15:11:01.127651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:22.689 [2024-11-20 15:11:01.127674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:22.689 [2024-11-20 15:11:01.127729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:22.689 [2024-11-20 15:11:01.127753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:22.689 #64 NEW cov: 12503 ft: 15609 corp: 33/1905b lim: 100 exec/s: 64 rss: 75Mb L: 75/95 MS: 1 InsertRepeatedBytes- 00:09:22.689 [2024-11-20 15:11:01.167472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.689 [2024-11-20 15:11:01.167499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.689 #65 NEW cov: 12503 ft: 15621 corp: 34/1932b lim: 100 exec/s: 65 rss: 75Mb L: 27/95 MS: 1 EraseBytes- 00:09:22.689 [2024-11-20 15:11:01.207678] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.689 [2024-11-20 15:11:01.207704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.689 [2024-11-20 15:11:01.207757] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:22.689 [2024-11-20 15:11:01.207771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:22.689 #66 NEW cov: 12503 ft: 15689 corp: 35/1985b lim: 100 exec/s: 33 rss: 75Mb L: 53/95 MS: 1 ShuffleBytes- 00:09:22.689 #66 DONE cov: 12503 ft: 15689 corp: 35/1985b lim: 100 exec/s: 33 rss: 75Mb 00:09:22.689 ###### Recommended dictionary. ###### 00:09:22.689 "\000\000\000\000" # Uses: 1 00:09:22.689 "\034\000" # Uses: 4 00:09:22.689 ###### End of recommended dictionary. 
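The recommended-dictionary footer above summarizes the byte strings libFuzzer's comparison instrumentation extracted and reused during this run; the Uses counts line up with the PersAutoDict mutations logged earlier (four reuses of "\034\000" and one of "\000\000\000\000", both printed as C octal escapes). To seed a later run with these entries explicitly, they can be written in libFuzzer's AFL-style dictionary syntax. A hedged sketch: the file name and keyword labels are made up, and whether SPDK's llvm_nvme_fuzz wrapper forwards -dict= to libFuzzer is not shown in this log:

  # Write the two entries in AFL/libFuzzer dictionary syntax
  # (octal \034\000 is hex 1c 00; \000\000\000\000 is four zero bytes).
  printf '%s\n' 'zeroes_4="\x00\x00\x00\x00"' 'kw_1c00="\x1c\x00"' \
      > /tmp/llvm_nvmf_18.dict
  # A stock libFuzzer target would then accept: -dict=/tmp/llvm_nvmf_18.dict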
###### 00:09:22.689 Done 66 runs in 2 second(s) 00:09:22.689 15:11:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_18.conf /var/tmp/suppress_nvmf_fuzz 00:09:22.689 15:11:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:22.689 15:11:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:22.689 15:11:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:09:22.689 15:11:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:09:22.689 15:11:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:09:22.689 15:11:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:09:22.689 15:11:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:09:22.689 15:11:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:09:22.689 15:11:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:22.689 15:11:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:22.689 15:11:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 19 00:09:22.689 15:11:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4419 00:09:22.689 15:11:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:09:22.689 15:11:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:09:22.689 15:11:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:22.689 15:11:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:22.689 15:11:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:22.689 15:11:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 00:09:22.947 [2024-11-20 15:11:01.386504] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
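The same setup now repeats for fuzzer type 19 with only the index changed, and the ../common.sh@72 and @73 markers bracketing every run give away the driver: both (( i++ )) and (( i < fuzz_num )) trace back to the same line 72, the signature of a single C-style for statement whose body (line 73) calls start_llvm_fuzz. A sketch of what that loop presumably looks like; fuzz_num's value and the precise meaning of the trailing arguments are inferred from the trace, not shown by it:

  # Rough shape of the common.sh driver loop implied by the @72/@73 markers.
  fuzz_num=25   # placeholder; the real count is not visible in this excerpt
  for (( i = 0; i < fuzz_num; i++ )); do
    # arguments seen in the trace: fuzzer type, time budget (surfaces as -t 1), core mask
    start_llvm_fuzz "$i" 1 0x1
  done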
00:09:22.947 [2024-11-20 15:11:01.386598] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1479837 ] 00:09:22.947 [2024-11-20 15:11:01.599068] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:22.947 [2024-11-20 15:11:01.613898] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:23.205 [2024-11-20 15:11:01.668136] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:23.205 [2024-11-20 15:11:01.684368] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:09:23.205 INFO: Running with entropic power schedule (0xFF, 100). 00:09:23.205 INFO: Seed: 2749264639 00:09:23.205 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:09:23.205 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:09:23.205 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:09:23.205 INFO: A corpus is not provided, starting from an empty corpus 00:09:23.205 #2 INITED exec/s: 0 rss: 66Mb 00:09:23.205 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:23.205 This may also happen if the target rejected all inputs we tried so far 00:09:23.205 [2024-11-20 15:11:01.739690] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11827654388357959598 len:2649 00:09:23.205 [2024-11-20 15:11:01.739722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.463 NEW_FUNC[1/715]: 0x479db8 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:09:23.463 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:23.463 #17 NEW cov: 12254 ft: 12234 corp: 2/11b lim: 50 exec/s: 0 rss: 73Mb L: 10/10 MS: 5 InsertByte-CopyPart-EraseBytes-ShuffleBytes-CMP- DE: "(*\217\256\244$E\000"- 00:09:23.463 [2024-11-20 15:11:02.080494] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11827654388357959598 len:2137 00:09:23.463 [2024-11-20 15:11:02.080531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.463 #18 NEW cov: 12367 ft: 12810 corp: 3/21b lim: 50 exec/s: 0 rss: 74Mb L: 10/10 MS: 1 ChangeBit- 00:09:23.463 [2024-11-20 15:11:02.140605] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2643125898289254308 len:2137 00:09:23.463 [2024-11-20 15:11:02.140635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.721 #19 NEW cov: 12373 ft: 13075 corp: 4/31b lim: 50 exec/s: 0 rss: 74Mb L: 10/10 MS: 1 ShuffleBytes- 00:09:23.721 [2024-11-20 15:11:02.200738] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11827654388357959598 len:16393 00:09:23.721 [2024-11-20 15:11:02.200767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.721 #20 NEW cov: 12458 ft: 13430 corp: 5/42b lim: 50 exec/s: 0 rss: 74Mb L: 11/11 MS: 1 InsertByte- 00:09:23.721 [2024-11-20 15:11:02.240822] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11827654388364552878 len:2649 00:09:23.721 [2024-11-20 15:11:02.240850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.721 #31 NEW cov: 12458 ft: 13539 corp: 6/52b lim: 50 exec/s: 0 rss: 74Mb L: 10/11 MS: 1 ShuffleBytes- 00:09:23.721 [2024-11-20 15:11:02.280961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11827654542976782254 len:17665 00:09:23.721 [2024-11-20 15:11:02.280989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.721 #32 NEW cov: 12458 ft: 13636 corp: 7/64b lim: 50 exec/s: 0 rss: 74Mb L: 12/12 MS: 1 CopyPart- 00:09:23.721 [2024-11-20 15:11:02.321161] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12584223138225072783 len:43 00:09:23.721 [2024-11-20 15:11:02.321189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.721 [2024-11-20 15:11:02.321238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4982182956491514916 len:2137 00:09:23.721 [2024-11-20 15:11:02.321255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.721 #38 NEW cov: 12458 ft: 13992 corp: 8/84b lim: 50 exec/s: 0 rss: 74Mb L: 20/20 MS: 1 PersAutoDict- DE: "(*\217\256\244$E\000"- 00:09:23.721 [2024-11-20 15:11:02.381458] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:09:23.721 [2024-11-20 15:11:02.381485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.721 [2024-11-20 15:11:02.381525] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:09:23.721 [2024-11-20 15:11:02.381541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.721 [2024-11-20 15:11:02.381596] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:09:23.721 [2024-11-20 15:11:02.381612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.721 #39 NEW cov: 12458 ft: 14292 corp: 9/122b lim: 50 exec/s: 0 rss: 74Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:09:23.980 [2024-11-20 15:11:02.421351] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11827654542976782254 len:17665 00:09:23.980 [2024-11-20 15:11:02.421379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.980 #40 NEW cov: 12458 ft: 14349 corp: 10/134b lim: 50 exec/s: 0 rss: 74Mb L: 12/38 MS: 1 CrossOver- 00:09:23.980 [2024-11-20 15:11:02.461565] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:579275503230534912 len:1 00:09:23.980 [2024-11-20 15:11:02.461593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.980 [2024-11-20 15:11:02.461640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:09:23.980 [2024-11-20 15:11:02.461656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.980 #42 NEW cov: 12458 ft: 14381 corp: 11/163b lim: 50 exec/s: 0 rss: 74Mb L: 29/38 MS: 2 EraseBytes-CrossOver- 00:09:23.980 [2024-11-20 15:11:02.501577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11866497935150623268 len:2649 00:09:23.980 [2024-11-20 15:11:02.501605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.980 #43 NEW cov: 12458 ft: 14401 corp: 12/173b lim: 50 exec/s: 0 rss: 74Mb L: 10/38 MS: 1 ShuffleBytes- 00:09:23.980 [2024-11-20 15:11:02.561753] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11827654388364552878 len:11865 00:09:23.980 [2024-11-20 15:11:02.561781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.980 #44 NEW cov: 12458 ft: 14430 corp: 13/183b lim: 50 exec/s: 0 rss: 74Mb L: 10/38 MS: 1 ChangeByte- 00:09:23.980 [2024-11-20 15:11:02.602059] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:09:23.980 [2024-11-20 15:11:02.602086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.980 [2024-11-20 15:11:02.602125] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:09:23.980 [2024-11-20 15:11:02.602139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.980 [2024-11-20 15:11:02.602193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:09:23.980 [2024-11-20 15:11:02.602209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.980 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:23.980 #45 NEW cov: 12481 ft: 14528 corp: 14/221b lim: 50 exec/s: 0 rss: 74Mb L: 38/38 MS: 1 CopyPart- 00:09:23.980 [2024-11-20 15:11:02.662302] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:09:23.980 [2024-11-20 15:11:02.662335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.980 [2024-11-20 15:11:02.662382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9851624184872960 len:1 00:09:23.980 [2024-11-20 15:11:02.662399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:09:23.980 [2024-11-20 15:11:02.662453] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:09:23.980 [2024-11-20 15:11:02.662468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:24.238 #46 NEW cov: 12481 ft: 14583 corp: 15/259b lim: 50 exec/s: 0 rss: 74Mb L: 38/38 MS: 1 ChangeByte- 00:09:24.238 [2024-11-20 15:11:02.702151] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11827654388357959598 len:10505 00:09:24.238 [2024-11-20 15:11:02.702180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.238 #47 NEW cov: 12481 ft: 14626 corp: 16/270b lim: 50 exec/s: 47 rss: 74Mb L: 11/38 MS: 1 ChangeByte- 00:09:24.238 [2024-11-20 15:11:02.762330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2643234749940404132 len:2137 00:09:24.238 [2024-11-20 15:11:02.762358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.238 #53 NEW cov: 12481 ft: 14654 corp: 17/280b lim: 50 exec/s: 53 rss: 74Mb L: 10/38 MS: 1 ChangeByte- 00:09:24.238 [2024-11-20 15:11:02.822698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:171966464000 len:10896 00:09:24.238 [2024-11-20 15:11:02.822724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.238 [2024-11-20 15:11:02.822769] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9851627114865733 len:1 00:09:24.238 [2024-11-20 15:11:02.822785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:24.238 [2024-11-20 15:11:02.822838] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:09:24.238 [2024-11-20 15:11:02.822854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:24.238 #54 NEW cov: 12481 ft: 14684 corp: 18/318b lim: 50 exec/s: 54 rss: 74Mb L: 38/38 MS: 1 PersAutoDict- DE: "(*\217\256\244$E\000"- 00:09:24.238 [2024-11-20 15:11:02.882877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:579275503230534912 len:1 00:09:24.238 [2024-11-20 15:11:02.882904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.238 [2024-11-20 15:11:02.882945] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:16999940613005372395 len:60396 00:09:24.238 [2024-11-20 15:11:02.882960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:24.238 [2024-11-20 15:11:02.883013] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:09:24.238 [2024-11-20 15:11:02.883029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 
p:0 m:0 dnr:1 00:09:24.496 #55 NEW cov: 12481 ft: 14712 corp: 19/356b lim: 50 exec/s: 55 rss: 74Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:09:24.496 [2024-11-20 15:11:02.942932] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11827655341840699310 len:57055 00:09:24.496 [2024-11-20 15:11:02.942960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.496 [2024-11-20 15:11:02.943032] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:16059518370053021406 len:57055 00:09:24.496 [2024-11-20 15:11:02.943049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:24.496 #56 NEW cov: 12481 ft: 14733 corp: 20/383b lim: 50 exec/s: 56 rss: 75Mb L: 27/38 MS: 1 InsertRepeatedBytes- 00:09:24.496 [2024-11-20 15:11:02.983156] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:16999940616948018155 len:60396 00:09:24.496 [2024-11-20 15:11:02.983184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.496 [2024-11-20 15:11:02.983229] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:16999940616948018155 len:60396 00:09:24.496 [2024-11-20 15:11:02.983245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:24.496 [2024-11-20 15:11:02.983298] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:10316249919801518888 len:9286 00:09:24.496 [2024-11-20 15:11:02.983319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:24.496 #57 NEW cov: 12481 ft: 14735 corp: 21/416b lim: 50 exec/s: 57 rss: 75Mb L: 33/38 MS: 1 InsertRepeatedBytes- 00:09:24.496 [2024-11-20 15:11:03.023024] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10353393068570978346 len:17665 00:09:24.496 [2024-11-20 15:11:03.023051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.496 #58 NEW cov: 12481 ft: 14777 corp: 22/426b lim: 50 exec/s: 58 rss: 75Mb L: 10/38 MS: 1 PersAutoDict- DE: "(*\217\256\244$E\000"- 00:09:24.496 [2024-11-20 15:11:03.063383] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:171966464000 len:63888 00:09:24.496 [2024-11-20 15:11:03.063410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.496 [2024-11-20 15:11:03.063444] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9851627114865733 len:1 00:09:24.496 [2024-11-20 15:11:03.063461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:24.496 [2024-11-20 15:11:03.063517] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:09:24.496 [2024-11-20 15:11:03.063533] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:24.496 #59 NEW cov: 12481 ft: 14845 corp: 23/464b lim: 50 exec/s: 59 rss: 75Mb L: 38/38 MS: 1 ChangeByte- 00:09:24.496 [2024-11-20 15:11:03.123357] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2643126070087946148 len:10753 00:09:24.496 [2024-11-20 15:11:03.123386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.496 #60 NEW cov: 12481 ft: 14855 corp: 24/480b lim: 50 exec/s: 60 rss: 75Mb L: 16/38 MS: 1 CrossOver- 00:09:24.496 [2024-11-20 15:11:03.163418] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13116778076252841984 len:36783 00:09:24.496 [2024-11-20 15:11:03.163444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.754 #63 NEW cov: 12481 ft: 14872 corp: 25/495b lim: 50 exec/s: 63 rss: 75Mb L: 15/38 MS: 3 EraseBytes-InsertByte-PersAutoDict- DE: "(*\217\256\244$E\000"- 00:09:24.754 [2024-11-20 15:11:03.223842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11827654388357959598 len:10896 00:09:24.754 [2024-11-20 15:11:03.223868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.754 [2024-11-20 15:11:03.223914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9851627114865733 len:1 00:09:24.754 [2024-11-20 15:11:03.223930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:24.754 [2024-11-20 15:11:03.223982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:09:24.754 [2024-11-20 15:11:03.223998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:24.754 #64 NEW cov: 12481 ft: 14885 corp: 26/533b lim: 50 exec/s: 64 rss: 75Mb L: 38/38 MS: 1 CrossOver- 00:09:24.754 [2024-11-20 15:11:03.263843] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2663360391170435620 len:44944 00:09:24.754 [2024-11-20 15:11:03.263870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.754 [2024-11-20 15:11:03.263922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2894125747652570693 len:22672 00:09:24.754 [2024-11-20 15:11:03.263938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:24.754 #68 NEW cov: 12481 ft: 14894 corp: 27/556b lim: 50 exec/s: 68 rss: 75Mb L: 23/38 MS: 4 EraseBytes-ChangeBinInt-CrossOver-CrossOver- 00:09:24.754 [2024-11-20 15:11:03.323925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:616751961440125102 len:17753 00:09:24.754 [2024-11-20 15:11:03.323955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.754 #69 NEW cov: 12481 ft: 14951 corp: 
28/566b lim: 50 exec/s: 69 rss: 75Mb L: 10/38 MS: 1 ShuffleBytes- 00:09:24.754 [2024-11-20 15:11:03.364231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:171966464000 len:63888 00:09:24.754 [2024-11-20 15:11:03.364259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.754 [2024-11-20 15:11:03.364295] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9851627114865733 len:1 00:09:24.754 [2024-11-20 15:11:03.364312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:24.754 [2024-11-20 15:11:03.364373] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:72056494526300160 len:1 00:09:24.754 [2024-11-20 15:11:03.364393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:24.754 #70 NEW cov: 12481 ft: 14962 corp: 29/604b lim: 50 exec/s: 70 rss: 75Mb L: 38/38 MS: 1 ChangeBinInt- 00:09:24.754 [2024-11-20 15:11:03.424201] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11827770133431619352 len:10283 00:09:24.754 [2024-11-20 15:11:03.424228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.012 #71 NEW cov: 12481 ft: 14967 corp: 30/621b lim: 50 exec/s: 71 rss: 75Mb L: 17/38 MS: 1 InsertByte- 00:09:25.012 [2024-11-20 15:11:03.484592] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11827784130730037166 len:10896 00:09:25.012 [2024-11-20 15:11:03.484618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.012 [2024-11-20 15:11:03.484653] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9851627114865733 len:1 00:09:25.012 [2024-11-20 15:11:03.484670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:25.012 [2024-11-20 15:11:03.484723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:09:25.012 [2024-11-20 15:11:03.484740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:25.012 #72 NEW cov: 12481 ft: 15005 corp: 31/659b lim: 50 exec/s: 72 rss: 75Mb L: 38/38 MS: 1 ChangeBinInt- 00:09:25.012 [2024-11-20 15:11:03.544538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3066745738246759598 len:10505 00:09:25.012 [2024-11-20 15:11:03.544567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.012 #73 NEW cov: 12481 ft: 15039 corp: 32/670b lim: 50 exec/s: 73 rss: 75Mb L: 11/38 MS: 1 ShuffleBytes- 00:09:25.012 [2024-11-20 15:11:03.604699] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12584223138225072783 len:89 00:09:25.012 [2024-11-20 15:11:03.604728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.012 #74 NEW cov: 12481 ft: 15086 corp: 33/680b lim: 50 exec/s: 74 rss: 75Mb L: 10/38 MS: 1 PersAutoDict- DE: "(*\217\256\244$E\000"- 00:09:25.012 [2024-11-20 15:11:03.644858] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12584223138275404431 len:89 00:09:25.012 [2024-11-20 15:11:03.644890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.012 #75 NEW cov: 12481 ft: 15163 corp: 34/690b lim: 50 exec/s: 75 rss: 75Mb L: 10/38 MS: 1 ChangeByte- 00:09:25.270 [2024-11-20 15:11:03.704914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11827622682909380376 len:36783 00:09:25.270 [2024-11-20 15:11:03.704942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.270 #76 NEW cov: 12481 ft: 15168 corp: 35/707b lim: 50 exec/s: 38 rss: 75Mb L: 17/38 MS: 1 PersAutoDict- DE: "(*\217\256\244$E\000"- 00:09:25.270 #76 DONE cov: 12481 ft: 15168 corp: 35/707b lim: 50 exec/s: 38 rss: 75Mb 00:09:25.270 ###### Recommended dictionary. ###### 00:09:25.270 "(*\217\256\244$E\000" # Uses: 7 00:09:25.270 ###### End of recommended dictionary. ###### 00:09:25.270 Done 76 runs in 2 second(s) 00:09:25.270 15:11:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_19.conf /var/tmp/suppress_nvmf_fuzz 00:09:25.270 15:11:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:25.270 15:11:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:25.270 15:11:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:09:25.270 15:11:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:09:25.270 15:11:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:09:25.270 15:11:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:09:25.270 15:11:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:09:25.270 15:11:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:09:25.270 15:11:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:25.270 15:11:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:25.270 15:11:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 20 00:09:25.270 15:11:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4420 00:09:25.270 15:11:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:09:25.270 15:11:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:09:25.270 15:11:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:25.270 15:11:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:25.271 15:11:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 
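One small wrinkle for fuzzer type 20: the sed expression above substitutes "trsvcid": "4420" with itself, because the computed port (44 plus the zero-padded index 20) happens to equal the default service ID already present in fuzz_json.conf. Assuming the script redirects sed's output the same way as for the other types (xtrace does not display redirections), the identity rewrite still produces the per-run config copy that the launch below consumes:

  # Identity rewrite for type 20; the output redirection is assumed, not traced.
  sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' \
      test/fuzz/llvm/nvmf/fuzz_json.conf > /tmp/fuzz_json_20.conf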
00:09:25.271 15:11:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 00:09:25.271 [2024-11-20 15:11:03.883688] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:09:25.271 [2024-11-20 15:11:03.883759] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1480190 ] 00:09:25.529 [2024-11-20 15:11:04.094226] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:25.529 [2024-11-20 15:11:04.108909] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:25.529 [2024-11-20 15:11:04.162064] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:25.529 [2024-11-20 15:11:04.178303] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:09:25.529 INFO: Running with entropic power schedule (0xFF, 100). 00:09:25.529 INFO: Seed: 949291046 00:09:25.529 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:09:25.529 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:09:25.529 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:09:25.529 INFO: A corpus is not provided, starting from an empty corpus 00:09:25.529 #2 INITED exec/s: 0 rss: 66Mb 00:09:25.529 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:25.529 This may also happen if the target rejected all inputs we tried so far 00:09:25.787 [2024-11-20 15:11:04.223264] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:25.787 [2024-11-20 15:11:04.223300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.787 [2024-11-20 15:11:04.223343] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:25.787 [2024-11-20 15:11:04.223361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:25.787 [2024-11-20 15:11:04.223393] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:25.787 [2024-11-20 15:11:04.223414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:26.045 NEW_FUNC[1/717]: 0x47b978 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:09:26.045 NEW_FUNC[2/717]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:26.045 #4 NEW cov: 12312 ft: 12311 corp: 2/61b lim: 90 exec/s: 0 rss: 73Mb L: 60/60 MS: 2 ChangeBit-InsertRepeatedBytes- 00:09:26.045 [2024-11-20 15:11:04.584141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:26.045 [2024-11-20 15:11:04.584184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.045 [2024-11-20 15:11:04.584236] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:26.045 [2024-11-20 15:11:04.584254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.045 [2024-11-20 15:11:04.584284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:26.045 [2024-11-20 15:11:04.584301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:26.045 #11 NEW cov: 12425 ft: 12931 corp: 3/122b lim: 90 exec/s: 0 rss: 73Mb L: 61/61 MS: 2 CopyPart-CrossOver- 00:09:26.045 [2024-11-20 15:11:04.644093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:26.045 [2024-11-20 15:11:04.644124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.045 [2024-11-20 15:11:04.644174] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:26.045 [2024-11-20 15:11:04.644193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.045 #12 NEW cov: 12431 ft: 13530 corp: 4/167b lim: 90 exec/s: 0 rss: 73Mb L: 45/61 MS: 1 EraseBytes- 00:09:26.303 [2024-11-20 15:11:04.744441] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:26.303 [2024-11-20 15:11:04.744473] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.303 [2024-11-20 15:11:04.744506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:26.303 [2024-11-20 15:11:04.744529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.303 [2024-11-20 15:11:04.744559] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:26.303 [2024-11-20 15:11:04.744575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:26.303 #13 NEW cov: 12516 ft: 13844 corp: 5/228b lim: 90 exec/s: 0 rss: 74Mb L: 61/61 MS: 1 ChangeBinInt- 00:09:26.303 [2024-11-20 15:11:04.844571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:26.303 [2024-11-20 15:11:04.844602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.303 #14 NEW cov: 12516 ft: 14670 corp: 6/255b lim: 90 exec/s: 0 rss: 74Mb L: 27/61 MS: 1 EraseBytes- 00:09:26.303 [2024-11-20 15:11:04.944977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:26.303 [2024-11-20 15:11:04.945009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.303 [2024-11-20 15:11:04.945043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:26.303 [2024-11-20 15:11:04.945067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.304 [2024-11-20 15:11:04.945102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:26.304 [2024-11-20 15:11:04.945119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:26.561 #15 NEW cov: 12516 ft: 14780 corp: 7/316b lim: 90 exec/s: 0 rss: 74Mb L: 61/61 MS: 1 CrossOver- 00:09:26.561 [2024-11-20 15:11:05.005023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:26.561 [2024-11-20 15:11:05.005053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.562 [2024-11-20 15:11:05.005103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:26.562 [2024-11-20 15:11:05.005121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.562 #16 NEW cov: 12516 ft: 14814 corp: 8/361b lim: 90 exec/s: 0 rss: 74Mb L: 45/61 MS: 1 CrossOver- 00:09:26.562 [2024-11-20 15:11:05.065287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:26.562 [2024-11-20 15:11:05.065324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.562 [2024-11-20 15:11:05.065360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:26.562 [2024-11-20 15:11:05.065378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.562 [2024-11-20 15:11:05.065409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:26.562 [2024-11-20 15:11:05.065425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:26.562 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:26.562 #17 NEW cov: 12533 ft: 14992 corp: 9/423b lim: 90 exec/s: 0 rss: 74Mb L: 62/62 MS: 1 InsertByte- 00:09:26.562 [2024-11-20 15:11:05.155507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:26.562 [2024-11-20 15:11:05.155538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.562 [2024-11-20 15:11:05.155572] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:26.562 [2024-11-20 15:11:05.155595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.562 [2024-11-20 15:11:05.155642] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:26.562 [2024-11-20 15:11:05.155659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:26.562 #18 NEW cov: 12533 ft: 15068 corp: 10/484b lim: 90 exec/s: 18 rss: 74Mb L: 61/62 MS: 1 ShuffleBytes- 00:09:26.562 [2024-11-20 15:11:05.245794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:26.562 [2024-11-20 15:11:05.245825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.562 [2024-11-20 15:11:05.245860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:26.562 [2024-11-20 15:11:05.245878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.562 [2024-11-20 15:11:05.245910] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:26.562 [2024-11-20 15:11:05.245928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:26.820 #19 NEW cov: 12533 ft: 15141 corp: 11/545b lim: 90 exec/s: 19 rss: 74Mb L: 61/62 MS: 1 ShuffleBytes- 00:09:26.820 [2024-11-20 15:11:05.305788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:26.820 [2024-11-20 15:11:05.305818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.820 #20 NEW cov: 12533 ft: 15222 corp: 12/569b lim: 90 exec/s: 20 rss: 74Mb L: 24/62 MS: 1 EraseBytes- 00:09:26.820 [2024-11-20 15:11:05.406174] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:26.820 [2024-11-20 15:11:05.406205] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.820 [2024-11-20 15:11:05.406240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:26.820 [2024-11-20 15:11:05.406258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.820 [2024-11-20 15:11:05.406289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:26.820 [2024-11-20 15:11:05.406331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:26.820 #21 NEW cov: 12533 ft: 15259 corp: 13/630b lim: 90 exec/s: 21 rss: 74Mb L: 61/62 MS: 1 ChangeBinInt- 00:09:26.820 [2024-11-20 15:11:05.466349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:26.820 [2024-11-20 15:11:05.466380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.820 [2024-11-20 15:11:05.466429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:26.820 [2024-11-20 15:11:05.466447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.820 [2024-11-20 15:11:05.466490] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:26.820 [2024-11-20 15:11:05.466507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:27.078 #22 NEW cov: 12533 ft: 15298 corp: 14/691b lim: 90 exec/s: 22 rss: 74Mb L: 61/62 MS: 1 ShuffleBytes- 00:09:27.078 [2024-11-20 15:11:05.526913] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.078 [2024-11-20 15:11:05.526941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.078 #23 NEW cov: 12533 ft: 15466 corp: 15/718b lim: 90 exec/s: 23 rss: 74Mb L: 27/62 MS: 1 ShuffleBytes- 00:09:27.078 [2024-11-20 15:11:05.567210] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.078 [2024-11-20 15:11:05.567236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.078 [2024-11-20 15:11:05.567290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:27.078 [2024-11-20 15:11:05.567307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.078 #29 NEW cov: 12533 ft: 15521 corp: 16/759b lim: 90 exec/s: 29 rss: 74Mb L: 41/62 MS: 1 EraseBytes- 00:09:27.078 [2024-11-20 15:11:05.607162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.078 [2024-11-20 15:11:05.607190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.078 #35 NEW cov: 12533 ft: 15551 corp: 17/786b lim: 90 exec/s: 35 
rss: 74Mb L: 27/62 MS: 1 ChangeBit- 00:09:27.078 [2024-11-20 15:11:05.667443] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.078 [2024-11-20 15:11:05.667477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.078 [2024-11-20 15:11:05.667530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:27.078 [2024-11-20 15:11:05.667547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.078 #36 NEW cov: 12533 ft: 15562 corp: 18/831b lim: 90 exec/s: 36 rss: 74Mb L: 45/62 MS: 1 ChangeBit- 00:09:27.078 [2024-11-20 15:11:05.707566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.078 [2024-11-20 15:11:05.707593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.078 [2024-11-20 15:11:05.707646] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:27.078 [2024-11-20 15:11:05.707663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.078 #37 NEW cov: 12533 ft: 15622 corp: 19/876b lim: 90 exec/s: 37 rss: 74Mb L: 45/62 MS: 1 ChangeByte- 00:09:27.336 [2024-11-20 15:11:05.767898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.336 [2024-11-20 15:11:05.767924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.336 [2024-11-20 15:11:05.767970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:27.336 [2024-11-20 15:11:05.767986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.336 [2024-11-20 15:11:05.768043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:27.336 [2024-11-20 15:11:05.768059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:27.336 #38 NEW cov: 12533 ft: 15642 corp: 20/937b lim: 90 exec/s: 38 rss: 74Mb L: 61/62 MS: 1 ChangeByte- 00:09:27.336 [2024-11-20 15:11:05.828047] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.336 [2024-11-20 15:11:05.828074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.336 [2024-11-20 15:11:05.828140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:27.336 [2024-11-20 15:11:05.828157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.336 [2024-11-20 15:11:05.828215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:27.336 [2024-11-20 15:11:05.828231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 
cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:27.336 #39 NEW cov: 12533 ft: 15722 corp: 21/998b lim: 90 exec/s: 39 rss: 74Mb L: 61/62 MS: 1 CopyPart- 00:09:27.336 [2024-11-20 15:11:05.867881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.336 [2024-11-20 15:11:05.867909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.336 #40 NEW cov: 12533 ft: 15730 corp: 22/1025b lim: 90 exec/s: 40 rss: 74Mb L: 27/62 MS: 1 ChangeByte- 00:09:27.336 [2024-11-20 15:11:05.928329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.336 [2024-11-20 15:11:05.928372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.336 [2024-11-20 15:11:05.928423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:27.337 [2024-11-20 15:11:05.928443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.337 [2024-11-20 15:11:05.928498] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:27.337 [2024-11-20 15:11:05.928514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:27.337 #41 NEW cov: 12533 ft: 15771 corp: 23/1086b lim: 90 exec/s: 41 rss: 74Mb L: 61/62 MS: 1 ChangeByte- 00:09:27.337 [2024-11-20 15:11:05.988181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.337 [2024-11-20 15:11:05.988209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.337 #42 NEW cov: 12533 ft: 15778 corp: 24/1113b lim: 90 exec/s: 42 rss: 74Mb L: 27/62 MS: 1 InsertRepeatedBytes- 00:09:27.594 [2024-11-20 15:11:06.028329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.594 [2024-11-20 15:11:06.028358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.594 #43 NEW cov: 12533 ft: 15813 corp: 25/1141b lim: 90 exec/s: 43 rss: 74Mb L: 28/62 MS: 1 InsertByte- 00:09:27.594 [2024-11-20 15:11:06.068433] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.594 [2024-11-20 15:11:06.068463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.594 #44 NEW cov: 12533 ft: 15914 corp: 26/1163b lim: 90 exec/s: 44 rss: 75Mb L: 22/62 MS: 1 EraseBytes- 00:09:27.594 [2024-11-20 15:11:06.128672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.594 [2024-11-20 15:11:06.128701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.594 #45 NEW cov: 12540 ft: 15938 corp: 27/1191b lim: 90 exec/s: 45 rss: 75Mb L: 28/62 MS: 1 InsertByte- 00:09:27.594 [2024-11-20 15:11:06.168740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.594 
[2024-11-20 15:11:06.168767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.594 #46 NEW cov: 12540 ft: 15957 corp: 28/1218b lim: 90 exec/s: 23 rss: 75Mb L: 27/62 MS: 1 ChangeByte- 00:09:27.594 #46 DONE cov: 12540 ft: 15957 corp: 28/1218b lim: 90 exec/s: 23 rss: 75Mb 00:09:27.594 Done 46 runs in 2 second(s) 00:09:27.853 15:11:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_20.conf /var/tmp/suppress_nvmf_fuzz 00:09:27.853 15:11:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:27.853 15:11:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:27.853 15:11:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:09:27.853 15:11:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:09:27.853 15:11:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:09:27.853 15:11:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:09:27.853 15:11:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:09:27.853 15:11:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:09:27.853 15:11:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:27.853 15:11:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:27.853 15:11:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 21 00:09:27.853 15:11:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4421 00:09:27.853 15:11:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:09:27.853 15:11:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:09:27.853 15:11:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:27.853 15:11:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:27.853 15:11:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:27.853 15:11:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 00:09:27.853 [2024-11-20 15:11:06.354205] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
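
Run 21 is launched above with the same flag set as the previous runs; only the port, config file, corpus directory, and -Z index change. A hypothetical annotated form of that launch line, with flag meanings inferred from the surrounding run.sh trace rather than from tool documentation (all variable values are copied from the trace):

    # -m 0x1  core mask                (run.sh sets core=0x1)
    # -s 512  hugepage memory in MB    (surfaces as "-m 512" in the EAL parameters below)
    # -P ...  output/artifact prefix
    # -F ...  NVMe-oF transport ID     (assembled at run.sh@37)
    # -c ...  per-run JSON config      (written by the sed step at run.sh@38)
    # -t 1    run time in seconds      (run.sh sets timen=1)
    # -D ...  on-disk corpus directory (created at run.sh@35)
    # -Z 21   fuzzer type to exercise  (the argument passed to start_llvm_fuzz)
    SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421'
    "$SPDK_DIR"/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 \
        -P "$SPDK_DIR/../output/llvm/" -F "$trid" -c /tmp/fuzz_json_21.conf \
        -t 1 -D "$SPDK_DIR/../corpus/llvm_nvmf_21" -Z 21
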
00:09:27.853 [2024-11-20 15:11:06.354274] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1480536 ] 00:09:28.112 [2024-11-20 15:11:06.563343] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:28.112 [2024-11-20 15:11:06.578056] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:28.112 [2024-11-20 15:11:06.630915] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:28.112 [2024-11-20 15:11:06.647145] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:09:28.112 INFO: Running with entropic power schedule (0xFF, 100). 00:09:28.112 INFO: Seed: 3416286779 00:09:28.112 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:09:28.112 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:09:28.112 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:09:28.112 INFO: A corpus is not provided, starting from an empty corpus 00:09:28.112 #2 INITED exec/s: 0 rss: 66Mb 00:09:28.112 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:28.112 This may also happen if the target rejected all inputs we tried so far 00:09:28.112 [2024-11-20 15:11:06.695687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:28.112 [2024-11-20 15:11:06.695718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.370 NEW_FUNC[1/717]: 0x47eba8 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:09:28.370 NEW_FUNC[2/717]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:28.370 #11 NEW cov: 12286 ft: 12265 corp: 2/14b lim: 50 exec/s: 0 rss: 73Mb L: 13/13 MS: 4 ChangeByte-ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:09:28.370 [2024-11-20 15:11:07.037068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:28.370 [2024-11-20 15:11:07.037107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.370 [2024-11-20 15:11:07.037156] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:28.370 [2024-11-20 15:11:07.037173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:28.370 [2024-11-20 15:11:07.037226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:28.370 [2024-11-20 15:11:07.037240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:28.370 [2024-11-20 15:11:07.037293] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:09:28.370 [2024-11-20 15:11:07.037312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 
dnr:1 00:09:28.628 #13 NEW cov: 12400 ft: 13668 corp: 3/58b lim: 50 exec/s: 0 rss: 74Mb L: 44/44 MS: 2 CopyPart-InsertRepeatedBytes- 00:09:28.628 [2024-11-20 15:11:07.086695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:28.628 [2024-11-20 15:11:07.086725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.629 #17 NEW cov: 12406 ft: 14047 corp: 4/68b lim: 50 exec/s: 0 rss: 74Mb L: 10/44 MS: 4 ShuffleBytes-InsertRepeatedBytes-ChangeBinInt-InsertByte- 00:09:28.629 [2024-11-20 15:11:07.126789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:28.629 [2024-11-20 15:11:07.126818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.629 #19 NEW cov: 12491 ft: 14303 corp: 5/79b lim: 50 exec/s: 0 rss: 74Mb L: 11/44 MS: 2 ChangeByte-InsertRepeatedBytes- 00:09:28.629 [2024-11-20 15:11:07.166860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:28.629 [2024-11-20 15:11:07.166888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.629 #20 NEW cov: 12491 ft: 14464 corp: 6/90b lim: 50 exec/s: 0 rss: 74Mb L: 11/44 MS: 1 EraseBytes- 00:09:28.629 [2024-11-20 15:11:07.227089] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:28.629 [2024-11-20 15:11:07.227117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.629 #21 NEW cov: 12491 ft: 14581 corp: 7/101b lim: 50 exec/s: 0 rss: 74Mb L: 11/44 MS: 1 ChangeBit- 00:09:28.629 [2024-11-20 15:11:07.287380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:28.629 [2024-11-20 15:11:07.287407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.629 [2024-11-20 15:11:07.287445] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:28.629 [2024-11-20 15:11:07.287461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:28.629 #22 NEW cov: 12491 ft: 14950 corp: 8/122b lim: 50 exec/s: 0 rss: 74Mb L: 21/44 MS: 1 CrossOver- 00:09:28.887 [2024-11-20 15:11:07.327412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:28.887 [2024-11-20 15:11:07.327440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.887 #23 NEW cov: 12491 ft: 14971 corp: 9/134b lim: 50 exec/s: 0 rss: 74Mb L: 12/44 MS: 1 InsertByte- 00:09:28.887 [2024-11-20 15:11:07.387558] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:28.887 [2024-11-20 15:11:07.387586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.887 #24 NEW cov: 12491 ft: 15013 corp: 10/151b lim: 50 exec/s: 0 rss: 74Mb L: 17/44 MS: 1 CrossOver- 00:09:28.887 
[2024-11-20 15:11:07.447719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:28.887 [2024-11-20 15:11:07.447748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.887 #25 NEW cov: 12491 ft: 15068 corp: 11/164b lim: 50 exec/s: 0 rss: 74Mb L: 13/44 MS: 1 CopyPart- 00:09:28.887 [2024-11-20 15:11:07.487994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:28.887 [2024-11-20 15:11:07.488021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.887 [2024-11-20 15:11:07.488079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:28.887 [2024-11-20 15:11:07.488096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:28.887 #26 NEW cov: 12491 ft: 15130 corp: 12/185b lim: 50 exec/s: 0 rss: 74Mb L: 21/44 MS: 1 ChangeBit- 00:09:28.887 [2024-11-20 15:11:07.547983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:28.887 [2024-11-20 15:11:07.548011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.146 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:29.146 #27 NEW cov: 12514 ft: 15151 corp: 13/198b lim: 50 exec/s: 0 rss: 74Mb L: 13/44 MS: 1 ChangeBinInt- 00:09:29.146 [2024-11-20 15:11:07.608130] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.146 [2024-11-20 15:11:07.608158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.146 #29 NEW cov: 12514 ft: 15165 corp: 14/216b lim: 50 exec/s: 0 rss: 74Mb L: 18/44 MS: 2 CrossOver-InsertRepeatedBytes- 00:09:29.146 [2024-11-20 15:11:07.648711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.146 [2024-11-20 15:11:07.648739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.146 [2024-11-20 15:11:07.648792] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:29.146 [2024-11-20 15:11:07.648810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.146 [2024-11-20 15:11:07.648861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:29.146 [2024-11-20 15:11:07.648877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:29.146 [2024-11-20 15:11:07.648933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:09:29.146 [2024-11-20 15:11:07.648950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:29.146 #30 NEW cov: 12514 ft: 15183 corp: 15/261b lim: 50 exec/s: 30 rss: 74Mb L: 
45/45 MS: 1 InsertByte- 00:09:29.146 [2024-11-20 15:11:07.708438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.146 [2024-11-20 15:11:07.708466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.146 #33 NEW cov: 12514 ft: 15237 corp: 16/273b lim: 50 exec/s: 33 rss: 74Mb L: 12/45 MS: 3 EraseBytes-ChangeBinInt-CopyPart- 00:09:29.146 [2024-11-20 15:11:07.748516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.146 [2024-11-20 15:11:07.748545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.146 #34 NEW cov: 12514 ft: 15311 corp: 17/284b lim: 50 exec/s: 34 rss: 74Mb L: 11/45 MS: 1 ShuffleBytes- 00:09:29.146 [2024-11-20 15:11:07.788795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.146 [2024-11-20 15:11:07.788822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.146 [2024-11-20 15:11:07.788859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:29.146 [2024-11-20 15:11:07.788875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.403 #35 NEW cov: 12514 ft: 15315 corp: 18/305b lim: 50 exec/s: 35 rss: 74Mb L: 21/45 MS: 1 ShuffleBytes- 00:09:29.403 [2024-11-20 15:11:07.848881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.403 [2024-11-20 15:11:07.848909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.403 #36 NEW cov: 12514 ft: 15321 corp: 19/319b lim: 50 exec/s: 36 rss: 74Mb L: 14/45 MS: 1 InsertByte- 00:09:29.403 [2024-11-20 15:11:07.889063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.403 [2024-11-20 15:11:07.889090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.403 [2024-11-20 15:11:07.889129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:29.403 [2024-11-20 15:11:07.889145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.403 #37 NEW cov: 12514 ft: 15333 corp: 20/341b lim: 50 exec/s: 37 rss: 74Mb L: 22/45 MS: 1 InsertByte- 00:09:29.403 [2024-11-20 15:11:07.949266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.403 [2024-11-20 15:11:07.949292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.403 [2024-11-20 15:11:07.949332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:29.403 [2024-11-20 15:11:07.949365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.403 #38 NEW cov: 12514 ft: 15360 corp: 21/363b 
lim: 50 exec/s: 38 rss: 75Mb L: 22/45 MS: 1 ShuffleBytes- 00:09:29.404 [2024-11-20 15:11:08.009292] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.404 [2024-11-20 15:11:08.009324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.404 #39 NEW cov: 12514 ft: 15362 corp: 22/377b lim: 50 exec/s: 39 rss: 75Mb L: 14/45 MS: 1 InsertByte- 00:09:29.404 [2024-11-20 15:11:08.049571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.404 [2024-11-20 15:11:08.049598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.404 [2024-11-20 15:11:08.049637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:29.404 [2024-11-20 15:11:08.049652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.404 #40 NEW cov: 12514 ft: 15418 corp: 23/399b lim: 50 exec/s: 40 rss: 75Mb L: 22/45 MS: 1 InsertByte- 00:09:29.662 [2024-11-20 15:11:08.089650] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.662 [2024-11-20 15:11:08.089676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.662 [2024-11-20 15:11:08.089712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:29.662 [2024-11-20 15:11:08.089727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.662 #41 NEW cov: 12514 ft: 15467 corp: 24/425b lim: 50 exec/s: 41 rss: 75Mb L: 26/45 MS: 1 CopyPart- 00:09:29.662 [2024-11-20 15:11:08.149708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.662 [2024-11-20 15:11:08.149736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.662 #42 NEW cov: 12514 ft: 15489 corp: 25/436b lim: 50 exec/s: 42 rss: 75Mb L: 11/45 MS: 1 ChangeBinInt- 00:09:29.662 [2024-11-20 15:11:08.210019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.662 [2024-11-20 15:11:08.210050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.662 [2024-11-20 15:11:08.210093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:29.662 [2024-11-20 15:11:08.210107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.662 #43 NEW cov: 12514 ft: 15514 corp: 26/457b lim: 50 exec/s: 43 rss: 75Mb L: 21/45 MS: 1 ChangeBinInt- 00:09:29.662 [2024-11-20 15:11:08.249954] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.662 [2024-11-20 15:11:08.249982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.662 #44 NEW cov: 12514 ft: 15525 
corp: 27/475b lim: 50 exec/s: 44 rss: 75Mb L: 18/45 MS: 1 ChangeBinInt- 00:09:29.662 [2024-11-20 15:11:08.310619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.662 [2024-11-20 15:11:08.310646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.662 [2024-11-20 15:11:08.310695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:29.662 [2024-11-20 15:11:08.310711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.662 [2024-11-20 15:11:08.310765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:29.662 [2024-11-20 15:11:08.310780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:29.662 [2024-11-20 15:11:08.310833] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:09:29.662 [2024-11-20 15:11:08.310849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:29.662 #45 NEW cov: 12514 ft: 15546 corp: 28/520b lim: 50 exec/s: 45 rss: 75Mb L: 45/45 MS: 1 InsertByte- 00:09:29.922 [2024-11-20 15:11:08.350416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.922 [2024-11-20 15:11:08.350443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.922 [2024-11-20 15:11:08.350478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:29.922 [2024-11-20 15:11:08.350494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.922 #46 NEW cov: 12514 ft: 15597 corp: 29/544b lim: 50 exec/s: 46 rss: 75Mb L: 24/45 MS: 1 CMP- DE: "\000\037"- 00:09:29.922 [2024-11-20 15:11:08.390361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.922 [2024-11-20 15:11:08.390388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.922 #47 NEW cov: 12514 ft: 15606 corp: 30/557b lim: 50 exec/s: 47 rss: 75Mb L: 13/45 MS: 1 CopyPart- 00:09:29.922 [2024-11-20 15:11:08.430791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.922 [2024-11-20 15:11:08.430818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.922 [2024-11-20 15:11:08.430860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:29.922 [2024-11-20 15:11:08.430875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.922 [2024-11-20 15:11:08.430930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:29.922 [2024-11-20 15:11:08.430950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:29.922 #48 NEW cov: 12514 ft: 15848 corp: 31/589b lim: 50 exec/s: 48 rss: 75Mb L: 32/45 MS: 1 CrossOver- 00:09:29.922 [2024-11-20 15:11:08.470609] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.922 [2024-11-20 15:11:08.470636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.922 #49 NEW cov: 12514 ft: 15878 corp: 32/603b lim: 50 exec/s: 49 rss: 75Mb L: 14/45 MS: 1 InsertByte- 00:09:29.922 [2024-11-20 15:11:08.530771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.922 [2024-11-20 15:11:08.530797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.922 #50 NEW cov: 12514 ft: 15920 corp: 33/614b lim: 50 exec/s: 50 rss: 75Mb L: 11/45 MS: 1 ChangeBit- 00:09:29.922 [2024-11-20 15:11:08.590930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.922 [2024-11-20 15:11:08.590957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.181 #51 NEW cov: 12514 ft: 15964 corp: 34/628b lim: 50 exec/s: 51 rss: 75Mb L: 14/45 MS: 1 ChangeBinInt- 00:09:30.181 [2024-11-20 15:11:08.651085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:30.181 [2024-11-20 15:11:08.651113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.181 #52 NEW cov: 12514 ft: 16008 corp: 35/639b lim: 50 exec/s: 52 rss: 75Mb L: 11/45 MS: 1 CrossOver- 00:09:30.182 [2024-11-20 15:11:08.691322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:30.182 [2024-11-20 15:11:08.691349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.182 [2024-11-20 15:11:08.691407] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:30.182 [2024-11-20 15:11:08.691422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.182 #53 NEW cov: 12514 ft: 16077 corp: 36/661b lim: 50 exec/s: 26 rss: 75Mb L: 22/45 MS: 1 PersAutoDict- DE: "\000\037"- 00:09:30.182 #53 DONE cov: 12514 ft: 16077 corp: 36/661b lim: 50 exec/s: 26 rss: 75Mb 00:09:30.182 ###### Recommended dictionary. ###### 00:09:30.182 "\000\037" # Uses: 1 00:09:30.182 ###### End of recommended dictionary. 
###### 00:09:30.182 Done 53 runs in 2 second(s) 00:09:30.182 15:11:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_21.conf /var/tmp/suppress_nvmf_fuzz 00:09:30.182 15:11:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:30.182 15:11:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:30.182 15:11:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:09:30.182 15:11:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:09:30.182 15:11:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:09:30.182 15:11:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:09:30.182 15:11:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:09:30.182 15:11:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:09:30.182 15:11:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:30.182 15:11:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:30.182 15:11:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 22 00:09:30.182 15:11:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4422 00:09:30.182 15:11:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:09:30.182 15:11:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:09:30.182 15:11:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:30.182 15:11:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:30.182 15:11:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:30.182 15:11:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 00:09:30.182 [2024-11-20 15:11:08.862410] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
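
Each iteration re-derives its listen port and JSON config before launching. A minimal sketch of the templating traced above for run 22, assuming the "44" port prefix and the sed redirection target, neither of which the xtrace output shows explicitly:

    SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    fuzzer_type=22
    port=44$(printf %02d "$fuzzer_type")   # yields 4420, 4421, 4422 across runs 20-22
    mkdir -p "$SPDK_DIR/../corpus/llvm_nvmf_$fuzzer_type"
    # Rewrite the template's default trsvcid (4420) to this run's port.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_$fuzzer_type.conf"

The matching transport ID string (trtype:tcp ... trsvcid:4422) is built the same way at run.sh@37, so the fuzzer client and the TCP listener started below agree on the port.
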
00:09:30.182 [2024-11-20 15:11:08.862496] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1480812 ] 00:09:30.442 [2024-11-20 15:11:09.065272] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:30.442 [2024-11-20 15:11:09.080253] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:30.709 [2024-11-20 15:11:09.133151] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:30.709 [2024-11-20 15:11:09.149395] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:09:30.709 INFO: Running with entropic power schedule (0xFF, 100). 00:09:30.709 INFO: Seed: 1625329068 00:09:30.709 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:09:30.709 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:09:30.709 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:09:30.709 INFO: A corpus is not provided, starting from an empty corpus 00:09:30.709 #2 INITED exec/s: 0 rss: 66Mb 00:09:30.709 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:30.709 This may also happen if the target rejected all inputs we tried so far 00:09:30.709 [2024-11-20 15:11:09.204874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:30.709 [2024-11-20 15:11:09.204907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.709 [2024-11-20 15:11:09.204966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:30.709 [2024-11-20 15:11:09.204983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.973 NEW_FUNC[1/717]: 0x480e78 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:09:30.973 NEW_FUNC[2/717]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:30.973 #8 NEW cov: 12313 ft: 12312 corp: 2/44b lim: 85 exec/s: 0 rss: 73Mb L: 43/43 MS: 1 InsertRepeatedBytes- 00:09:30.973 [2024-11-20 15:11:09.535907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:30.973 [2024-11-20 15:11:09.535943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.973 [2024-11-20 15:11:09.535998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:30.973 [2024-11-20 15:11:09.536015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.973 [2024-11-20 15:11:09.536071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:30.973 [2024-11-20 15:11:09.536090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:30.973 #10 NEW cov: 
12426 ft: 13220 corp: 3/107b lim: 85 exec/s: 0 rss: 74Mb L: 63/63 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:09:30.973 [2024-11-20 15:11:09.576060] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:30.973 [2024-11-20 15:11:09.576089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.973 [2024-11-20 15:11:09.576136] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:30.973 [2024-11-20 15:11:09.576152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.973 [2024-11-20 15:11:09.576204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:30.973 [2024-11-20 15:11:09.576220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:30.973 [2024-11-20 15:11:09.576274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:30.973 [2024-11-20 15:11:09.576290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:30.973 #13 NEW cov: 12432 ft: 13803 corp: 4/179b lim: 85 exec/s: 0 rss: 74Mb L: 72/72 MS: 3 InsertByte-CopyPart-InsertRepeatedBytes- 00:09:30.973 [2024-11-20 15:11:09.615875] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:30.973 [2024-11-20 15:11:09.615903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.973 [2024-11-20 15:11:09.615948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:30.973 [2024-11-20 15:11:09.615964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.233 #14 NEW cov: 12517 ft: 14172 corp: 5/222b lim: 85 exec/s: 0 rss: 74Mb L: 43/72 MS: 1 CopyPart- 00:09:31.233 [2024-11-20 15:11:09.676330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:31.233 [2024-11-20 15:11:09.676359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.233 [2024-11-20 15:11:09.676413] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:31.233 [2024-11-20 15:11:09.676429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.233 [2024-11-20 15:11:09.676483] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:31.233 [2024-11-20 15:11:09.676500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:31.233 [2024-11-20 15:11:09.676555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:31.233 [2024-11-20 15:11:09.676571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 
sqhd:0005 p:0 m:0 dnr:1 00:09:31.233 #15 NEW cov: 12517 ft: 14335 corp: 6/294b lim: 85 exec/s: 0 rss: 74Mb L: 72/72 MS: 1 CopyPart- 00:09:31.233 [2024-11-20 15:11:09.736504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:31.233 [2024-11-20 15:11:09.736530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.233 [2024-11-20 15:11:09.736578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:31.233 [2024-11-20 15:11:09.736594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.233 [2024-11-20 15:11:09.736652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:31.233 [2024-11-20 15:11:09.736669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:31.233 [2024-11-20 15:11:09.736724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:31.233 [2024-11-20 15:11:09.736741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:31.233 #16 NEW cov: 12517 ft: 14392 corp: 7/366b lim: 85 exec/s: 0 rss: 74Mb L: 72/72 MS: 1 ChangeBinInt- 00:09:31.233 [2024-11-20 15:11:09.776309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:31.233 [2024-11-20 15:11:09.776341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.233 [2024-11-20 15:11:09.776381] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:31.233 [2024-11-20 15:11:09.776397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.233 #17 NEW cov: 12517 ft: 14546 corp: 8/401b lim: 85 exec/s: 0 rss: 74Mb L: 35/72 MS: 1 EraseBytes- 00:09:31.233 [2024-11-20 15:11:09.836443] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:31.233 [2024-11-20 15:11:09.836470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.233 [2024-11-20 15:11:09.836509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:31.233 [2024-11-20 15:11:09.836524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.233 #18 NEW cov: 12517 ft: 14651 corp: 9/435b lim: 85 exec/s: 0 rss: 74Mb L: 34/72 MS: 1 EraseBytes- 00:09:31.233 [2024-11-20 15:11:09.876593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:31.233 [2024-11-20 15:11:09.876620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.233 [2024-11-20 15:11:09.876658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:31.233 [2024-11-20 15:11:09.876674] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.492 #19 NEW cov: 12517 ft: 14742 corp: 10/469b lim: 85 exec/s: 0 rss: 74Mb L: 34/72 MS: 1 ChangeBinInt- 00:09:31.492 [2024-11-20 15:11:09.937062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:31.492 [2024-11-20 15:11:09.937090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.492 [2024-11-20 15:11:09.937145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:31.492 [2024-11-20 15:11:09.937161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.492 [2024-11-20 15:11:09.937215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:31.492 [2024-11-20 15:11:09.937231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:31.492 [2024-11-20 15:11:09.937286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:31.492 [2024-11-20 15:11:09.937302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:31.492 #20 NEW cov: 12517 ft: 14796 corp: 11/541b lim: 85 exec/s: 0 rss: 74Mb L: 72/72 MS: 1 ChangeBinInt- 00:09:31.492 [2024-11-20 15:11:09.997237] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:31.492 [2024-11-20 15:11:09.997265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.492 [2024-11-20 15:11:09.997324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:31.492 [2024-11-20 15:11:09.997356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.492 [2024-11-20 15:11:09.997423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:31.492 [2024-11-20 15:11:09.997440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:31.492 [2024-11-20 15:11:09.997495] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:31.492 [2024-11-20 15:11:09.997511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:31.492 #21 NEW cov: 12517 ft: 14826 corp: 12/613b lim: 85 exec/s: 0 rss: 74Mb L: 72/72 MS: 1 ShuffleBytes- 00:09:31.492 [2024-11-20 15:11:10.057177] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:31.492 [2024-11-20 15:11:10.057215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.492 [2024-11-20 15:11:10.057271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:31.492 [2024-11-20 
15:11:10.057287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.492 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:31.492 #22 NEW cov: 12540 ft: 14906 corp: 13/649b lim: 85 exec/s: 0 rss: 74Mb L: 36/72 MS: 1 InsertByte- 00:09:31.492 [2024-11-20 15:11:10.117470] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:31.492 [2024-11-20 15:11:10.117506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.492 [2024-11-20 15:11:10.117549] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:31.492 [2024-11-20 15:11:10.117565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.492 [2024-11-20 15:11:10.117621] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:31.492 [2024-11-20 15:11:10.117637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:31.492 #23 NEW cov: 12540 ft: 14954 corp: 14/712b lim: 85 exec/s: 0 rss: 74Mb L: 63/72 MS: 1 ChangeBit- 00:09:31.751 [2024-11-20 15:11:10.177479] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:31.752 [2024-11-20 15:11:10.177511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.752 [2024-11-20 15:11:10.177563] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:31.752 [2024-11-20 15:11:10.177580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.752 #24 NEW cov: 12540 ft: 14964 corp: 15/748b lim: 85 exec/s: 24 rss: 74Mb L: 36/72 MS: 1 CopyPart- 00:09:31.752 [2024-11-20 15:11:10.237939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:31.752 [2024-11-20 15:11:10.237967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.752 [2024-11-20 15:11:10.238010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:31.752 [2024-11-20 15:11:10.238025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.752 [2024-11-20 15:11:10.238080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:31.752 [2024-11-20 15:11:10.238097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:31.752 [2024-11-20 15:11:10.238151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:31.752 [2024-11-20 15:11:10.238168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:31.752 #25 NEW cov: 12540 ft: 15050 
corp: 16/820b lim: 85 exec/s: 25 rss: 74Mb L: 72/72 MS: 1 ChangeBit- 00:09:31.752 [2024-11-20 15:11:10.277692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:31.752 [2024-11-20 15:11:10.277720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.752 [2024-11-20 15:11:10.277758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:31.752 [2024-11-20 15:11:10.277775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.752 #26 NEW cov: 12540 ft: 15069 corp: 17/855b lim: 85 exec/s: 26 rss: 74Mb L: 35/72 MS: 1 ChangeByte- 00:09:31.752 [2024-11-20 15:11:10.317982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:31.752 [2024-11-20 15:11:10.318009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.752 [2024-11-20 15:11:10.318057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:31.752 [2024-11-20 15:11:10.318073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.752 [2024-11-20 15:11:10.318131] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:31.752 [2024-11-20 15:11:10.318147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:31.752 #27 NEW cov: 12540 ft: 15122 corp: 18/914b lim: 85 exec/s: 27 rss: 75Mb L: 59/72 MS: 1 EraseBytes- 00:09:31.752 [2024-11-20 15:11:10.378153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:31.752 [2024-11-20 15:11:10.378180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.752 [2024-11-20 15:11:10.378244] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:31.752 [2024-11-20 15:11:10.378260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.752 [2024-11-20 15:11:10.378319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:31.752 [2024-11-20 15:11:10.378336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:31.752 #28 NEW cov: 12540 ft: 15138 corp: 19/977b lim: 85 exec/s: 28 rss: 75Mb L: 63/72 MS: 1 ShuffleBytes- 00:09:31.752 [2024-11-20 15:11:10.418138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:31.752 [2024-11-20 15:11:10.418166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.752 [2024-11-20 15:11:10.418215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:31.752 [2024-11-20 15:11:10.418235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.011 #29 NEW cov: 12540 ft: 15139 corp: 20/1014b lim: 85 exec/s: 29 rss: 75Mb L: 37/72 MS: 1 InsertByte- 00:09:32.011 [2024-11-20 15:11:10.478291] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.011 [2024-11-20 15:11:10.478320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.011 [2024-11-20 15:11:10.478378] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.011 [2024-11-20 15:11:10.478394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.012 #30 NEW cov: 12540 ft: 15208 corp: 21/1051b lim: 85 exec/s: 30 rss: 75Mb L: 37/72 MS: 1 ShuffleBytes- 00:09:32.012 [2024-11-20 15:11:10.538442] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.012 [2024-11-20 15:11:10.538469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.012 [2024-11-20 15:11:10.538507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.012 [2024-11-20 15:11:10.538523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.012 #36 NEW cov: 12540 ft: 15227 corp: 22/1087b lim: 85 exec/s: 36 rss: 75Mb L: 36/72 MS: 1 InsertByte- 00:09:32.012 [2024-11-20 15:11:10.578593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.012 [2024-11-20 15:11:10.578620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.012 [2024-11-20 15:11:10.578669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.012 [2024-11-20 15:11:10.578685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.012 #37 NEW cov: 12540 ft: 15236 corp: 23/1121b lim: 85 exec/s: 37 rss: 75Mb L: 34/72 MS: 1 CrossOver- 00:09:32.012 [2024-11-20 15:11:10.638880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.012 [2024-11-20 15:11:10.638906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.012 [2024-11-20 15:11:10.638947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.012 [2024-11-20 15:11:10.638963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.012 [2024-11-20 15:11:10.639018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:32.012 [2024-11-20 15:11:10.639034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:32.012 #38 NEW cov: 12540 ft: 15281 corp: 24/1185b lim: 85 exec/s: 38 rss: 75Mb L: 64/72 MS: 1 InsertByte- 
00:09:32.272 [2024-11-20 15:11:10.698914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.272 [2024-11-20 15:11:10.698942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.272 [2024-11-20 15:11:10.698981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.272 [2024-11-20 15:11:10.698998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.272 #39 NEW cov: 12540 ft: 15305 corp: 25/1221b lim: 85 exec/s: 39 rss: 75Mb L: 36/72 MS: 1 InsertByte- 00:09:32.272 [2024-11-20 15:11:10.738996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.272 [2024-11-20 15:11:10.739023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.272 [2024-11-20 15:11:10.739063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.272 [2024-11-20 15:11:10.739079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.272 #40 NEW cov: 12540 ft: 15345 corp: 26/1260b lim: 85 exec/s: 40 rss: 75Mb L: 39/72 MS: 1 CrossOver- 00:09:32.272 [2024-11-20 15:11:10.799152] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.272 [2024-11-20 15:11:10.799179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.272 [2024-11-20 15:11:10.799218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.272 [2024-11-20 15:11:10.799234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.272 #41 NEW cov: 12540 ft: 15347 corp: 27/1295b lim: 85 exec/s: 41 rss: 75Mb L: 35/72 MS: 1 CopyPart- 00:09:32.272 [2024-11-20 15:11:10.859689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.272 [2024-11-20 15:11:10.859718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.272 [2024-11-20 15:11:10.859769] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.272 [2024-11-20 15:11:10.859785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.272 [2024-11-20 15:11:10.859839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:32.272 [2024-11-20 15:11:10.859854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:32.272 [2024-11-20 15:11:10.859910] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:32.272 [2024-11-20 15:11:10.859927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 
dnr:1 00:09:32.272 #42 NEW cov: 12540 ft: 15359 corp: 28/1368b lim: 85 exec/s: 42 rss: 75Mb L: 73/73 MS: 1 InsertByte- 00:09:32.272 [2024-11-20 15:11:10.919555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.272 [2024-11-20 15:11:10.919582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.272 [2024-11-20 15:11:10.919620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.272 [2024-11-20 15:11:10.919636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.531 #43 NEW cov: 12540 ft: 15390 corp: 29/1403b lim: 85 exec/s: 43 rss: 75Mb L: 35/73 MS: 1 InsertByte- 00:09:32.531 [2024-11-20 15:11:10.980004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.531 [2024-11-20 15:11:10.980031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.531 [2024-11-20 15:11:10.980085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.531 [2024-11-20 15:11:10.980101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.531 [2024-11-20 15:11:10.980154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:32.531 [2024-11-20 15:11:10.980173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:32.531 [2024-11-20 15:11:10.980227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:32.531 [2024-11-20 15:11:10.980244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:32.531 #44 NEW cov: 12540 ft: 15396 corp: 30/1475b lim: 85 exec/s: 44 rss: 75Mb L: 72/73 MS: 1 ChangeBinInt- 00:09:32.531 [2024-11-20 15:11:11.020147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.531 [2024-11-20 15:11:11.020173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.531 [2024-11-20 15:11:11.020210] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.531 [2024-11-20 15:11:11.020227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.531 [2024-11-20 15:11:11.020280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:32.531 [2024-11-20 15:11:11.020296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:32.531 [2024-11-20 15:11:11.020367] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:32.531 [2024-11-20 15:11:11.020384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 
p:0 m:0 dnr:1 00:09:32.531 #45 NEW cov: 12540 ft: 15402 corp: 31/1547b lim: 85 exec/s: 45 rss: 75Mb L: 72/73 MS: 1 CopyPart- 00:09:32.531 [2024-11-20 15:11:11.060268] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.531 [2024-11-20 15:11:11.060294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.531 [2024-11-20 15:11:11.060353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.531 [2024-11-20 15:11:11.060370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.532 [2024-11-20 15:11:11.060421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:32.532 [2024-11-20 15:11:11.060437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:32.532 [2024-11-20 15:11:11.060507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:32.532 [2024-11-20 15:11:11.060523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:32.532 #46 NEW cov: 12540 ft: 15407 corp: 32/1623b lim: 85 exec/s: 46 rss: 75Mb L: 76/76 MS: 1 InsertRepeatedBytes- 00:09:32.532 [2024-11-20 15:11:11.100048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.532 [2024-11-20 15:11:11.100075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.532 [2024-11-20 15:11:11.100113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.532 [2024-11-20 15:11:11.100129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.532 #47 NEW cov: 12540 ft: 15425 corp: 33/1659b lim: 85 exec/s: 47 rss: 75Mb L: 36/76 MS: 1 CMP- DE: "\000\000\000\000"- 00:09:32.532 [2024-11-20 15:11:11.140295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.532 [2024-11-20 15:11:11.140327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.532 [2024-11-20 15:11:11.140373] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.532 [2024-11-20 15:11:11.140389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.532 [2024-11-20 15:11:11.140446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:32.532 [2024-11-20 15:11:11.140462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:32.532 #48 NEW cov: 12540 ft: 15455 corp: 34/1722b lim: 85 exec/s: 48 rss: 75Mb L: 63/76 MS: 1 ChangeBit- 00:09:32.532 [2024-11-20 15:11:11.180589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.532 
[2024-11-20 15:11:11.180617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.532 [2024-11-20 15:11:11.180685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.532 [2024-11-20 15:11:11.180703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.532 [2024-11-20 15:11:11.180759] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:32.532 [2024-11-20 15:11:11.180774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:32.532 [2024-11-20 15:11:11.180829] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:32.532 [2024-11-20 15:11:11.180846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:32.792 [2024-11-20 15:11:11.220763] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.792 [2024-11-20 15:11:11.220792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.792 [2024-11-20 15:11:11.220842] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.792 [2024-11-20 15:11:11.220859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.792 [2024-11-20 15:11:11.220912] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:32.792 [2024-11-20 15:11:11.220930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:32.792 [2024-11-20 15:11:11.220986] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:32.792 [2024-11-20 15:11:11.221002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:32.792 #50 NEW cov: 12540 ft: 15497 corp: 35/1796b lim: 85 exec/s: 25 rss: 75Mb L: 74/76 MS: 2 InsertByte-InsertByte- 00:09:32.792 #50 DONE cov: 12540 ft: 15497 corp: 35/1796b lim: 85 exec/s: 25 rss: 75Mb 00:09:32.792 ###### Recommended dictionary. ###### 00:09:32.792 "\000\000\000\000" # Uses: 0 00:09:32.792 ###### End of recommended dictionary. 
###### 00:09:32.792 Done 50 runs in 2 second(s) 00:09:32.792 15:11:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_22.conf /var/tmp/suppress_nvmf_fuzz 00:09:32.792 15:11:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:32.792 15:11:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:32.792 15:11:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:09:32.792 15:11:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:09:32.792 15:11:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:09:32.792 15:11:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:09:32.792 15:11:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:09:32.792 15:11:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:09:32.792 15:11:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:32.792 15:11:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:32.792 15:11:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 23 00:09:32.792 15:11:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4423 00:09:32.792 15:11:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:09:32.792 15:11:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:09:32.792 15:11:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:32.792 15:11:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:32.792 15:11:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:32.792 15:11:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 00:09:32.792 [2024-11-20 15:11:11.396657] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
00:09:32.792 [2024-11-20 15:11:11.396748] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1481107 ] 00:09:33.051 [2024-11-20 15:11:11.600620] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:33.051 [2024-11-20 15:11:11.615617] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:33.051 [2024-11-20 15:11:11.668520] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:33.051 [2024-11-20 15:11:11.684757] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:09:33.051 INFO: Running with entropic power schedule (0xFF, 100). 00:09:33.051 INFO: Seed: 4160327849 00:09:33.051 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:09:33.051 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:09:33.051 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:09:33.051 INFO: A corpus is not provided, starting from an empty corpus 00:09:33.051 #2 INITED exec/s: 0 rss: 66Mb 00:09:33.051 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:33.051 This may also happen if the target rejected all inputs we tried so far 00:09:33.312 [2024-11-20 15:11:11.740184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:33.312 [2024-11-20 15:11:11.740217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.312 [2024-11-20 15:11:11.740275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:33.312 [2024-11-20 15:11:11.740291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.571 NEW_FUNC[1/716]: 0x4840b8 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:09:33.571 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:33.571 #8 NEW cov: 12246 ft: 12245 corp: 2/13b lim: 25 exec/s: 0 rss: 73Mb L: 12/12 MS: 1 InsertRepeatedBytes- 00:09:33.571 [2024-11-20 15:11:12.081203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:33.571 [2024-11-20 15:11:12.081241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.571 [2024-11-20 15:11:12.081309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:33.571 [2024-11-20 15:11:12.081334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.571 [2024-11-20 15:11:12.081390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:33.571 [2024-11-20 15:11:12.081406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.571 #12 NEW cov: 12359 ft: 
13098 corp: 3/28b lim: 25 exec/s: 0 rss: 73Mb L: 15/15 MS: 4 ChangeByte-InsertByte-CrossOver-InsertRepeatedBytes- 00:09:33.571 [2024-11-20 15:11:12.121105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:33.572 [2024-11-20 15:11:12.121134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.572 [2024-11-20 15:11:12.121202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:33.572 [2024-11-20 15:11:12.121219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.572 #13 NEW cov: 12365 ft: 13430 corp: 4/40b lim: 25 exec/s: 0 rss: 73Mb L: 12/15 MS: 1 ChangeByte- 00:09:33.572 [2024-11-20 15:11:12.181246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:33.572 [2024-11-20 15:11:12.181273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.572 [2024-11-20 15:11:12.181357] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:33.572 [2024-11-20 15:11:12.181373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.572 #14 NEW cov: 12450 ft: 13767 corp: 5/52b lim: 25 exec/s: 0 rss: 73Mb L: 12/15 MS: 1 CrossOver- 00:09:33.572 [2024-11-20 15:11:12.221457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:33.572 [2024-11-20 15:11:12.221486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.572 [2024-11-20 15:11:12.221543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:33.572 [2024-11-20 15:11:12.221558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.572 [2024-11-20 15:11:12.221617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:33.572 [2024-11-20 15:11:12.221633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.831 #15 NEW cov: 12450 ft: 13808 corp: 6/68b lim: 25 exec/s: 0 rss: 74Mb L: 16/16 MS: 1 CopyPart- 00:09:33.831 [2024-11-20 15:11:12.281515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:33.831 [2024-11-20 15:11:12.281541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.831 [2024-11-20 15:11:12.281597] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:33.831 [2024-11-20 15:11:12.281614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.831 #16 NEW cov: 12450 ft: 13872 corp: 7/80b lim: 25 exec/s: 0 rss: 74Mb L: 12/16 MS: 1 ChangeBinInt- 00:09:33.831 [2024-11-20 15:11:12.341788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) 
sqid:1 cid:0 nsid:0 00:09:33.831 [2024-11-20 15:11:12.341817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.831 [2024-11-20 15:11:12.341872] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:33.831 [2024-11-20 15:11:12.341887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.831 [2024-11-20 15:11:12.341945] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:33.831 [2024-11-20 15:11:12.341961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.831 #17 NEW cov: 12450 ft: 13966 corp: 8/97b lim: 25 exec/s: 0 rss: 74Mb L: 17/17 MS: 1 CopyPart- 00:09:33.831 [2024-11-20 15:11:12.402124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:33.831 [2024-11-20 15:11:12.402152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.831 [2024-11-20 15:11:12.402205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:33.831 [2024-11-20 15:11:12.402221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.831 [2024-11-20 15:11:12.402277] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:33.831 [2024-11-20 15:11:12.402292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.831 [2024-11-20 15:11:12.402354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:33.831 [2024-11-20 15:11:12.402371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:33.831 #19 NEW cov: 12450 ft: 14423 corp: 9/119b lim: 25 exec/s: 0 rss: 74Mb L: 22/22 MS: 2 ChangeByte-InsertRepeatedBytes- 00:09:33.831 [2024-11-20 15:11:12.441959] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:33.831 [2024-11-20 15:11:12.441986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.831 [2024-11-20 15:11:12.442054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:33.831 [2024-11-20 15:11:12.442071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.831 #20 NEW cov: 12450 ft: 14455 corp: 10/131b lim: 25 exec/s: 0 rss: 74Mb L: 12/22 MS: 1 ChangeBinInt- 00:09:33.831 [2024-11-20 15:11:12.482207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:33.831 [2024-11-20 15:11:12.482233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.831 [2024-11-20 15:11:12.482290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) 
sqid:1 cid:1 nsid:0 00:09:33.831 [2024-11-20 15:11:12.482304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.831 [2024-11-20 15:11:12.482370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:33.831 [2024-11-20 15:11:12.482388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:34.091 #21 NEW cov: 12450 ft: 14507 corp: 11/147b lim: 25 exec/s: 0 rss: 74Mb L: 16/22 MS: 1 ChangeBit- 00:09:34.091 [2024-11-20 15:11:12.542242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.091 [2024-11-20 15:11:12.542268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.091 [2024-11-20 15:11:12.542346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.091 [2024-11-20 15:11:12.542369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.091 #22 NEW cov: 12450 ft: 14539 corp: 12/159b lim: 25 exec/s: 0 rss: 74Mb L: 12/22 MS: 1 ChangeBinInt- 00:09:34.091 [2024-11-20 15:11:12.582349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.091 [2024-11-20 15:11:12.582375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.091 [2024-11-20 15:11:12.582430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.091 [2024-11-20 15:11:12.582447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.091 #23 NEW cov: 12450 ft: 14567 corp: 13/171b lim: 25 exec/s: 0 rss: 74Mb L: 12/22 MS: 1 ChangeByte- 00:09:34.091 [2024-11-20 15:11:12.622468] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.091 [2024-11-20 15:11:12.622495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.091 [2024-11-20 15:11:12.622552] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.091 [2024-11-20 15:11:12.622569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.091 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:34.091 #24 NEW cov: 12473 ft: 14611 corp: 14/183b lim: 25 exec/s: 0 rss: 74Mb L: 12/22 MS: 1 ChangeByte- 00:09:34.091 [2024-11-20 15:11:12.682684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.091 [2024-11-20 15:11:12.682711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.091 [2024-11-20 15:11:12.682754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.091 [2024-11-20 15:11:12.682770] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.091 #25 NEW cov: 12473 ft: 14697 corp: 15/193b lim: 25 exec/s: 0 rss: 74Mb L: 10/22 MS: 1 EraseBytes- 00:09:34.091 [2024-11-20 15:11:12.722771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.091 [2024-11-20 15:11:12.722798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.091 [2024-11-20 15:11:12.722853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.091 [2024-11-20 15:11:12.722870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.091 #26 NEW cov: 12473 ft: 14701 corp: 16/206b lim: 25 exec/s: 26 rss: 74Mb L: 13/22 MS: 1 InsertByte- 00:09:34.091 [2024-11-20 15:11:12.763172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.091 [2024-11-20 15:11:12.763200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.091 [2024-11-20 15:11:12.763256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.091 [2024-11-20 15:11:12.763271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.091 [2024-11-20 15:11:12.763332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:34.091 [2024-11-20 15:11:12.763357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:34.091 [2024-11-20 15:11:12.763415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:34.091 [2024-11-20 15:11:12.763432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:34.350 #27 NEW cov: 12473 ft: 14762 corp: 17/230b lim: 25 exec/s: 27 rss: 74Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:09:34.350 [2024-11-20 15:11:12.823053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.350 [2024-11-20 15:11:12.823081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.350 [2024-11-20 15:11:12.823154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.350 [2024-11-20 15:11:12.823168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.350 #28 NEW cov: 12473 ft: 14771 corp: 18/242b lim: 25 exec/s: 28 rss: 74Mb L: 12/24 MS: 1 CrossOver- 00:09:34.350 [2024-11-20 15:11:12.883335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.350 [2024-11-20 15:11:12.883364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.350 [2024-11-20 15:11:12.883415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.350 [2024-11-20 15:11:12.883432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.350 [2024-11-20 15:11:12.883503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:34.350 [2024-11-20 15:11:12.883520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:34.350 #29 NEW cov: 12473 ft: 14797 corp: 19/258b lim: 25 exec/s: 29 rss: 74Mb L: 16/24 MS: 1 CrossOver- 00:09:34.350 [2024-11-20 15:11:12.923395] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.350 [2024-11-20 15:11:12.923424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.350 [2024-11-20 15:11:12.923475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.350 [2024-11-20 15:11:12.923491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.351 #30 NEW cov: 12473 ft: 14827 corp: 20/270b lim: 25 exec/s: 30 rss: 74Mb L: 12/24 MS: 1 ChangeByte- 00:09:34.351 [2024-11-20 15:11:12.983172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.351 [2024-11-20 15:11:12.983200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.351 [2024-11-20 15:11:12.983241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.351 [2024-11-20 15:11:12.983257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.351 #31 NEW cov: 12473 ft: 14957 corp: 21/283b lim: 25 exec/s: 31 rss: 74Mb L: 13/24 MS: 1 CopyPart- 00:09:34.610 [2024-11-20 15:11:13.043943] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.610 [2024-11-20 15:11:13.043973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.610 [2024-11-20 15:11:13.044025] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.610 [2024-11-20 15:11:13.044041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.610 [2024-11-20 15:11:13.044104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:34.610 [2024-11-20 15:11:13.044122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:34.610 [2024-11-20 15:11:13.044178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:34.610 [2024-11-20 15:11:13.044194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:34.610 #32 NEW cov: 12473 ft: 14972 corp: 22/306b lim: 25 exec/s: 32 rss: 74Mb L: 23/24 MS: 1 CMP- DE: 
"\017\000\000\000\000\000\000\000"- 00:09:34.610 [2024-11-20 15:11:13.083898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.610 [2024-11-20 15:11:13.083926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.610 [2024-11-20 15:11:13.083974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.610 [2024-11-20 15:11:13.083991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.610 [2024-11-20 15:11:13.084051] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:34.610 [2024-11-20 15:11:13.084067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:34.610 #33 NEW cov: 12473 ft: 14996 corp: 23/322b lim: 25 exec/s: 33 rss: 74Mb L: 16/24 MS: 1 CopyPart- 00:09:34.610 [2024-11-20 15:11:13.124160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.610 [2024-11-20 15:11:13.124187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.610 [2024-11-20 15:11:13.124241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.610 [2024-11-20 15:11:13.124257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.610 [2024-11-20 15:11:13.124319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:34.610 [2024-11-20 15:11:13.124336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:34.610 [2024-11-20 15:11:13.124393] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:34.610 [2024-11-20 15:11:13.124409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:34.610 #34 NEW cov: 12473 ft: 15007 corp: 24/343b lim: 25 exec/s: 34 rss: 74Mb L: 21/24 MS: 1 InsertRepeatedBytes- 00:09:34.610 [2024-11-20 15:11:13.164187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.610 [2024-11-20 15:11:13.164216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.610 [2024-11-20 15:11:13.164265] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.610 [2024-11-20 15:11:13.164282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.610 [2024-11-20 15:11:13.164345] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:34.610 [2024-11-20 15:11:13.164362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:34.610 #35 NEW cov: 12473 ft: 15021 corp: 25/359b lim: 25 exec/s: 35 rss: 74Mb L: 
16/24 MS: 1 CMP- DE: "\377\000"- 00:09:34.610 [2024-11-20 15:11:13.224447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.610 [2024-11-20 15:11:13.224474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.610 [2024-11-20 15:11:13.224549] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.610 [2024-11-20 15:11:13.224564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.610 [2024-11-20 15:11:13.224620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:34.610 [2024-11-20 15:11:13.224635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:34.610 [2024-11-20 15:11:13.224692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:34.610 [2024-11-20 15:11:13.224707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:34.610 #36 NEW cov: 12473 ft: 15038 corp: 26/381b lim: 25 exec/s: 36 rss: 74Mb L: 22/24 MS: 1 CopyPart- 00:09:34.610 [2024-11-20 15:11:13.284670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.611 [2024-11-20 15:11:13.284697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.611 [2024-11-20 15:11:13.284738] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.611 [2024-11-20 15:11:13.284754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.611 [2024-11-20 15:11:13.284808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:34.611 [2024-11-20 15:11:13.284824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:34.611 [2024-11-20 15:11:13.284881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:34.611 [2024-11-20 15:11:13.284895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:34.870 #37 NEW cov: 12473 ft: 15065 corp: 27/403b lim: 25 exec/s: 37 rss: 74Mb L: 22/24 MS: 1 ChangeByte- 00:09:34.870 [2024-11-20 15:11:13.324491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.870 [2024-11-20 15:11:13.324516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.870 [2024-11-20 15:11:13.324560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.870 [2024-11-20 15:11:13.324576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.870 #38 NEW cov: 12473 ft: 15164 corp: 28/413b lim: 25 exec/s: 38 rss: 74Mb L: 10/24 MS: 1 
EraseBytes- 00:09:34.870 [2024-11-20 15:11:13.364729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.870 [2024-11-20 15:11:13.364756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.870 [2024-11-20 15:11:13.364827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.870 [2024-11-20 15:11:13.364844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.870 [2024-11-20 15:11:13.364901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:34.870 [2024-11-20 15:11:13.364917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:34.870 #39 NEW cov: 12473 ft: 15175 corp: 29/428b lim: 25 exec/s: 39 rss: 74Mb L: 15/24 MS: 1 CrossOver- 00:09:34.870 [2024-11-20 15:11:13.404689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.870 [2024-11-20 15:11:13.404715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.870 [2024-11-20 15:11:13.404753] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.870 [2024-11-20 15:11:13.404769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.870 #40 NEW cov: 12473 ft: 15182 corp: 30/440b lim: 25 exec/s: 40 rss: 74Mb L: 12/24 MS: 1 CrossOver- 00:09:34.870 [2024-11-20 15:11:13.445103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.870 [2024-11-20 15:11:13.445130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.870 [2024-11-20 15:11:13.445187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.870 [2024-11-20 15:11:13.445201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.870 [2024-11-20 15:11:13.445272] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:34.870 [2024-11-20 15:11:13.445286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:34.870 [2024-11-20 15:11:13.445344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:34.870 [2024-11-20 15:11:13.445361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:34.870 #41 NEW cov: 12473 ft: 15263 corp: 31/464b lim: 25 exec/s: 41 rss: 74Mb L: 24/24 MS: 1 CrossOver- 00:09:34.871 [2024-11-20 15:11:13.485015] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.871 [2024-11-20 15:11:13.485042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:09:34.871 [2024-11-20 15:11:13.485092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.871 [2024-11-20 15:11:13.485108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.871 [2024-11-20 15:11:13.485162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:34.871 [2024-11-20 15:11:13.485177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:34.871 #42 NEW cov: 12473 ft: 15326 corp: 32/481b lim: 25 exec/s: 42 rss: 75Mb L: 17/24 MS: 1 ChangeBit- 00:09:34.871 [2024-11-20 15:11:13.545440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.871 [2024-11-20 15:11:13.545466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.871 [2024-11-20 15:11:13.545526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.871 [2024-11-20 15:11:13.545541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.871 [2024-11-20 15:11:13.545595] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:34.871 [2024-11-20 15:11:13.545610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:34.871 [2024-11-20 15:11:13.545666] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:34.871 [2024-11-20 15:11:13.545684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:35.130 #43 NEW cov: 12473 ft: 15371 corp: 33/504b lim: 25 exec/s: 43 rss: 75Mb L: 23/24 MS: 1 ChangeBit- 00:09:35.130 [2024-11-20 15:11:13.605553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.130 [2024-11-20 15:11:13.605580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.130 [2024-11-20 15:11:13.605642] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:35.130 [2024-11-20 15:11:13.605657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:35.130 [2024-11-20 15:11:13.605710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:35.130 [2024-11-20 15:11:13.605726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:35.130 [2024-11-20 15:11:13.605782] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:35.130 [2024-11-20 15:11:13.605795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:35.130 #44 NEW cov: 12473 ft: 15403 corp: 34/526b lim: 25 exec/s: 44 rss: 75Mb L: 22/24 MS: 1 CrossOver- 00:09:35.130 
[2024-11-20 15:11:13.645769] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.130 [2024-11-20 15:11:13.645797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.130 [2024-11-20 15:11:13.645854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:35.130 [2024-11-20 15:11:13.645869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:35.130 [2024-11-20 15:11:13.645925] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:35.130 [2024-11-20 15:11:13.645939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:35.130 [2024-11-20 15:11:13.645992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:35.131 [2024-11-20 15:11:13.646007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:35.131 [2024-11-20 15:11:13.646060] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:09:35.131 [2024-11-20 15:11:13.646075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:35.131 #45 NEW cov: 12473 ft: 15453 corp: 35/551b lim: 25 exec/s: 45 rss: 75Mb L: 25/25 MS: 1 CopyPart- 00:09:35.131 [2024-11-20 15:11:13.705803] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.131 [2024-11-20 15:11:13.705829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.131 [2024-11-20 15:11:13.705888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:35.131 [2024-11-20 15:11:13.705903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:35.131 [2024-11-20 15:11:13.705974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:35.131 [2024-11-20 15:11:13.705989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:35.131 [2024-11-20 15:11:13.706045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:35.131 [2024-11-20 15:11:13.706064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:35.131 #46 NEW cov: 12473 ft: 15546 corp: 36/573b lim: 25 exec/s: 23 rss: 75Mb L: 22/25 MS: 1 ChangeBinInt- 00:09:35.131 #46 DONE cov: 12473 ft: 15546 corp: 36/573b lim: 25 exec/s: 23 rss: 75Mb 00:09:35.131 ###### Recommended dictionary. ###### 00:09:35.131 "\017\000\000\000\000\000\000\000" # Uses: 0 00:09:35.131 "\377\000" # Uses: 0 00:09:35.131 ###### End of recommended dictionary. 
###### 00:09:35.131 Done 46 runs in 2 second(s) 00:09:35.390 15:11:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_23.conf /var/tmp/suppress_nvmf_fuzz 00:09:35.390 15:11:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:35.390 15:11:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:35.390 15:11:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:09:35.390 15:11:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:09:35.390 15:11:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:09:35.390 15:11:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:09:35.390 15:11:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:09:35.390 15:11:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:09:35.390 15:11:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:35.390 15:11:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:35.390 15:11:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 24 00:09:35.390 15:11:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4424 00:09:35.390 15:11:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:09:35.390 15:11:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:09:35.390 15:11:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:35.390 15:11:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:35.390 15:11:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:35.390 15:11:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 00:09:35.390 [2024-11-20 15:11:13.883307] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
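The xtrace above captures how nvmf/run.sh provisions each fuzzer instance before launching it: the TCP service ID is derived from the fuzzer index (printf %02d 24 yielding port 4424), the shared fuzz_json.conf is rewritten with sed so this instance listens on its own trsvcid, two known shutdown-path leaks (spdk_nvmf_qpair_disconnect, nvmf_ctrlr_create) are suppressed for LSAN, and llvm_nvme_fuzz is started against the resulting transport ID. A minimal stand-alone sketch of that sequence, assuming an $SPDK_DIR root in place of the Jenkins workspace paths and inferring the "44" port prefix from port=4424 in the trace:

#!/usr/bin/env bash
# Hedged sketch of the per-fuzzer setup visible in the nvmf/run.sh trace above.
# $SPDK_DIR stands in for the workspace path; the "44" prefix is inferred
# from printf %02d 24 being followed by port=4424 in the log.
fuzzer_type=24
timen=1
core=0x1
port="44$(printf %02d "$fuzzer_type")"                       # -> 4424
nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
suppress_file=/var/tmp/suppress_nvmf_fuzz
corpus_dir="$SPDK_DIR/../corpus/llvm_nvmf_${fuzzer_type}"
mkdir -p "$corpus_dir"
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"
# Point this instance's JSON config at its own service ID.
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
    "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
# Suppress the two known shutdown-path leaks so LSAN reports stay actionable.
printf 'leak:%s\n' spdk_nvmf_qpair_disconnect nvmf_ctrlr_create > "$suppress_file"
export LSAN_OPTIONS="report_objects=1:suppressions=${suppress_file}:print_suppressions=0"
"$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
    -P "$SPDK_DIR/../output/llvm/" -F "$trid" -c "$nvmf_cfg" \
    -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"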
00:09:35.390 [2024-11-20 15:11:13.883386] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1481457 ] 00:09:35.649 [2024-11-20 15:11:14.081213] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:35.649 [2024-11-20 15:11:14.096219] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:35.649 [2024-11-20 15:11:14.149106] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:35.649 [2024-11-20 15:11:14.165363] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:09:35.649 INFO: Running with entropic power schedule (0xFF, 100). 00:09:35.649 INFO: Seed: 2345363766 00:09:35.649 INFO: Loaded 1 modules (388394 inline 8-bit counters): 388394 [0x2ae000c, 0x2b3ed36), 00:09:35.649 INFO: Loaded 1 PC tables (388394 PCs): 388394 [0x2b3ed38,0x312bfd8), 00:09:35.649 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:09:35.649 INFO: A corpus is not provided, starting from an empty corpus 00:09:35.649 #2 INITED exec/s: 0 rss: 66Mb 00:09:35.649 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:35.649 This may also happen if the target rejected all inputs we tried so far 00:09:35.649 [2024-11-20 15:11:14.232472] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070253576191 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.649 [2024-11-20 15:11:14.232519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.908 NEW_FUNC[1/717]: 0x4851a8 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:09:35.908 NEW_FUNC[2/717]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:35.908 #5 NEW cov: 12318 ft: 12317 corp: 2/28b lim: 100 exec/s: 0 rss: 73Mb L: 27/27 MS: 3 ChangeBinInt-InsertByte-InsertRepeatedBytes- 00:09:35.908 [2024-11-20 15:11:14.573464] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070253511935 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.908 [2024-11-20 15:11:14.573508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.168 #6 NEW cov: 12431 ft: 12953 corp: 3/55b lim: 100 exec/s: 0 rss: 73Mb L: 27/27 MS: 1 ChangeBinInt- 00:09:36.168 [2024-11-20 15:11:14.643683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446743052346327039 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:36.168 [2024-11-20 15:11:14.643715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.168 #12 NEW cov: 12437 ft: 13200 corp: 4/83b lim: 100 exec/s: 0 rss: 73Mb L: 28/28 MS: 1 InsertByte- 00:09:36.168 [2024-11-20 15:11:14.693758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069588020737 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:36.168 [2024-11-20 15:11:14.693789] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.168 #15 NEW cov: 12522 ft: 13485 corp: 5/113b lim: 100 exec/s: 0 rss: 73Mb L: 30/30 MS: 3 InsertByte-InsertByte-CrossOver- 00:09:36.168 [2024-11-20 15:11:14.744125] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069588020737 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:36.168 [2024-11-20 15:11:14.744154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.168 #16 NEW cov: 12522 ft: 13552 corp: 6/143b lim: 100 exec/s: 0 rss: 74Mb L: 30/30 MS: 1 ChangeBinInt- 00:09:36.168 [2024-11-20 15:11:14.814342] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070253511935 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:36.168 [2024-11-20 15:11:14.814371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.427 #17 NEW cov: 12522 ft: 13615 corp: 7/170b lim: 100 exec/s: 0 rss: 74Mb L: 27/30 MS: 1 ChangeBinInt- 00:09:36.427 [2024-11-20 15:11:14.884582] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070253511423 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:36.427 [2024-11-20 15:11:14.884612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.427 #18 NEW cov: 12522 ft: 13648 corp: 8/197b lim: 100 exec/s: 0 rss: 74Mb L: 27/30 MS: 1 ChangeBinInt- 00:09:36.427 [2024-11-20 15:11:14.954932] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070253511935 len:40961 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:36.427 [2024-11-20 15:11:14.954961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.427 #19 NEW cov: 12522 ft: 13718 corp: 9/225b lim: 100 exec/s: 0 rss: 74Mb L: 28/30 MS: 1 InsertByte- 00:09:36.427 [2024-11-20 15:11:15.005127] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070253576191 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:36.427 [2024-11-20 15:11:15.005159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.427 #20 NEW cov: 12522 ft: 13770 corp: 10/252b lim: 100 exec/s: 0 rss: 74Mb L: 27/30 MS: 1 ChangeByte- 00:09:36.427 [2024-11-20 15:11:15.055797] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446463716263265535 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:36.427 [2024-11-20 15:11:15.055831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.427 [2024-11-20 15:11:15.055935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:36.427 [2024-11-20 15:11:15.055956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:36.427 #21 NEW cov: 12522 ft: 14580 corp: 11/302b lim: 100 exec/s: 0 rss: 74Mb L: 50/50 MS: 1 CopyPart- 
00:09:36.427 [2024-11-20 15:11:15.105881] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12080808861995870119 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:36.427 [2024-11-20 15:11:15.105911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.686 NEW_FUNC[1/1]: 0x1c5cca8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:36.686 #25 NEW cov: 12545 ft: 14610 corp: 12/332b lim: 100 exec/s: 0 rss: 74Mb L: 30/50 MS: 4 CrossOver-CopyPart-ChangeByte-InsertRepeatedBytes- 00:09:36.686 [2024-11-20 15:11:15.156247] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070256460287 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:36.686 [2024-11-20 15:11:15.156279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.686 #26 NEW cov: 12545 ft: 14644 corp: 13/361b lim: 100 exec/s: 0 rss: 74Mb L: 29/50 MS: 1 InsertByte- 00:09:36.686 [2024-11-20 15:11:15.226697] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070253576191 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:36.686 [2024-11-20 15:11:15.226729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.686 #27 NEW cov: 12545 ft: 14653 corp: 14/396b lim: 100 exec/s: 27 rss: 74Mb L: 35/50 MS: 1 CMP- DE: "\001E$\253\342\262B\204"- 00:09:36.686 [2024-11-20 15:11:15.276932] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:360287966733664255 len:40961 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:36.686 [2024-11-20 15:11:15.276966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.686 #28 NEW cov: 12545 ft: 14663 corp: 15/424b lim: 100 exec/s: 28 rss: 74Mb L: 28/50 MS: 1 ShuffleBytes- 00:09:36.686 [2024-11-20 15:11:15.347682] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446463742033069311 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:36.686 [2024-11-20 15:11:15.347718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.686 [2024-11-20 15:11:15.347794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:36.686 [2024-11-20 15:11:15.347816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:36.945 #29 NEW cov: 12545 ft: 14690 corp: 16/474b lim: 100 exec/s: 29 rss: 74Mb L: 50/50 MS: 1 ChangeByte- 00:09:36.945 [2024-11-20 15:11:15.417929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070256460287 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:36.945 [2024-11-20 15:11:15.417961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.945 #30 NEW cov: 12545 ft: 14722 corp: 17/503b lim: 100 exec/s: 30 rss: 74Mb L: 29/50 MS: 1 ChangeBit- 00:09:36.945 [2024-11-20 15:11:15.488487] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070253511935 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:36.945 [2024-11-20 15:11:15.488517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.945 #31 NEW cov: 12545 ft: 14784 corp: 18/531b lim: 100 exec/s: 31 rss: 74Mb L: 28/50 MS: 1 ShuffleBytes- 00:09:36.945 [2024-11-20 15:11:15.538639] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069588020737 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:36.945 [2024-11-20 15:11:15.538669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.945 #32 NEW cov: 12545 ft: 14802 corp: 19/569b lim: 100 exec/s: 32 rss: 74Mb L: 38/50 MS: 1 InsertRepeatedBytes- 00:09:36.945 [2024-11-20 15:11:15.589112] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070256460287 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:36.945 [2024-11-20 15:11:15.589142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.205 #33 NEW cov: 12545 ft: 14821 corp: 20/599b lim: 100 exec/s: 33 rss: 74Mb L: 30/50 MS: 1 InsertByte- 00:09:37.205 [2024-11-20 15:11:15.659564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446743052346327039 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.205 [2024-11-20 15:11:15.659596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.205 #34 NEW cov: 12545 ft: 14881 corp: 21/627b lim: 100 exec/s: 34 rss: 74Mb L: 28/50 MS: 1 ChangeBit- 00:09:37.205 [2024-11-20 15:11:15.710766] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070253576191 len:512 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.205 [2024-11-20 15:11:15.710798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.205 [2024-11-20 15:11:15.710871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.205 [2024-11-20 15:11:15.710891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.205 [2024-11-20 15:11:15.710963] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446692396663046143 len:65282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.205 [2024-11-20 15:11:15.710985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.205 #35 NEW cov: 12545 ft: 15241 corp: 22/689b lim: 100 exec/s: 35 rss: 74Mb L: 62/62 MS: 1 CrossOver- 00:09:37.205 [2024-11-20 15:11:15.780365] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070253511423 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.205 [2024-11-20 15:11:15.780394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.205 #36 NEW cov: 12545 ft: 
15301 corp: 23/716b lim: 100 exec/s: 36 rss: 75Mb L: 27/62 MS: 1 ChangeBit- 00:09:37.205 [2024-11-20 15:11:15.850646] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070256460287 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.205 [2024-11-20 15:11:15.850676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.205 #37 NEW cov: 12545 ft: 15304 corp: 24/746b lim: 100 exec/s: 37 rss: 75Mb L: 30/62 MS: 1 InsertByte- 00:09:37.464 [2024-11-20 15:11:15.900883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12441096832185509799 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.464 [2024-11-20 15:11:15.900913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.464 #38 NEW cov: 12545 ft: 15321 corp: 25/776b lim: 100 exec/s: 38 rss: 75Mb L: 30/62 MS: 1 ChangeBinInt- 00:09:37.464 [2024-11-20 15:11:15.971519] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446463716263265535 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.464 [2024-11-20 15:11:15.971549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.464 [2024-11-20 15:11:15.971619] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.464 [2024-11-20 15:11:15.971649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.464 #39 NEW cov: 12545 ft: 15340 corp: 26/827b lim: 100 exec/s: 39 rss: 75Mb L: 51/62 MS: 1 CopyPart- 00:09:37.464 [2024-11-20 15:11:16.022039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446743052346327039 len:2828 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.464 [2024-11-20 15:11:16.022071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.464 [2024-11-20 15:11:16.022143] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:795741901218843403 len:2828 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.464 [2024-11-20 15:11:16.022163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.464 [2024-11-20 15:11:16.022240] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:795741901218843403 len:2828 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.464 [2024-11-20 15:11:16.022259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.464 #40 NEW cov: 12545 ft: 15353 corp: 27/905b lim: 100 exec/s: 40 rss: 75Mb L: 78/78 MS: 1 InsertRepeatedBytes- 00:09:37.464 [2024-11-20 15:11:16.091752] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:360287966733664255 len:40961 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.464 [2024-11-20 15:11:16.091782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.464 #41 
NEW cov: 12545 ft: 15369 corp: 28/933b lim: 100 exec/s: 41 rss: 75Mb L: 28/78 MS: 1 ChangeBit- 00:09:37.723 [2024-11-20 15:11:16.162515] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446463716263265535 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.723 [2024-11-20 15:11:16.162547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.723 [2024-11-20 15:11:16.162620] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.723 [2024-11-20 15:11:16.162638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.723 #42 NEW cov: 12545 ft: 15418 corp: 29/983b lim: 100 exec/s: 42 rss: 75Mb L: 50/78 MS: 1 CrossOver- 00:09:37.723 [2024-11-20 15:11:16.213344] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446463742033069311 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.723 [2024-11-20 15:11:16.213375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.723 [2024-11-20 15:11:16.213448] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.724 [2024-11-20 15:11:16.213466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.724 [2024-11-20 15:11:16.213538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6872316419617283935 len:24416 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.724 [2024-11-20 15:11:16.213557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.724 #43 NEW cov: 12545 ft: 15430 corp: 30/1050b lim: 100 exec/s: 21 rss: 75Mb L: 67/78 MS: 1 InsertRepeatedBytes- 00:09:37.724 #43 DONE cov: 12545 ft: 15430 corp: 30/1050b lim: 100 exec/s: 21 rss: 75Mb 00:09:37.724 ###### Recommended dictionary. ###### 00:09:37.724 "\001E$\253\342\262B\204" # Uses: 0 00:09:37.724 ###### End of recommended dictionary. 
###### 00:09:37.724 Done 43 runs in 2 second(s) 00:09:37.724 15:11:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_24.conf /var/tmp/suppress_nvmf_fuzz 00:09:37.724 15:11:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:37.724 15:11:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:37.724 15:11:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@79 -- # trap - SIGINT SIGTERM EXIT 00:09:37.724 00:09:37.724 real 1m3.134s 00:09:37.724 user 1m39.309s 00:09:37.724 sys 0m7.517s 00:09:37.724 15:11:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:37.724 15:11:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:37.724 ************************************ 00:09:37.724 END TEST nvmf_llvm_fuzz 00:09:37.724 ************************************ 00:09:37.724 15:11:16 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:09:37.724 15:11:16 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:09:37.724 15:11:16 llvm_fuzz -- fuzz/llvm.sh@20 -- # run_test vfio_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:09:37.724 15:11:16 llvm_fuzz -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:09:37.724 15:11:16 llvm_fuzz -- common/autotest_common.sh@1111 -- # xtrace_disable 00:09:37.724 15:11:16 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:37.984 ************************************ 00:09:37.984 START TEST vfio_llvm_fuzz 00:09:37.984 ************************************ 00:09:37.984 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:09:37.984 * Looking for test storage... 
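With the nvmf suite finished (46 and 43 runs in the last two instances, about 63 seconds wall time per the timing above), the trace shows llvm.sh's outer dispatch: it loops over the configured fuzzer suites and hands each suite's run.sh to run_test, which produces the timing and the START/END TEST bookkeeping banners seen in the log. A minimal sketch of that loop, assuming a fuzzers array and $testdir layout consistent with the two suites visible in this log:

# Hedged sketch of the llvm.sh dispatch seen in the trace above; the array
# contents and $testdir name are assumptions beyond the nvmf and vfio suites
# that actually appear in this log. run_test comes from autotest_common.sh.
fuzzers=(nvmf vfio)
for fuzzer in "${fuzzers[@]}"; do
  case "$fuzzer" in
    nvmf) run_test "nvmf_llvm_fuzz" "$testdir/nvmf/run.sh" ;;
    vfio) run_test "vfio_llvm_fuzz" "$testdir/vfio/run.sh" ;;
  esac
done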
00:09:37.984 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:37.984 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:09:37.984 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:09:37.984 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:09:37.984 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:09:37.984 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:09:37.984 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:09:37.984 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:09:37.985 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:37.985 --rc genhtml_branch_coverage=1 00:09:37.985 --rc genhtml_function_coverage=1 00:09:37.985 --rc genhtml_legend=1 00:09:37.985 --rc geninfo_all_blocks=1 00:09:37.985 --rc geninfo_unexecuted_blocks=1 00:09:37.985 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:37.985 ' 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:09:37.985 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:37.985 --rc genhtml_branch_coverage=1 00:09:37.985 --rc genhtml_function_coverage=1 00:09:37.985 --rc genhtml_legend=1 00:09:37.985 --rc geninfo_all_blocks=1 00:09:37.985 --rc geninfo_unexecuted_blocks=1 00:09:37.985 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:37.985 ' 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:09:37.985 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:37.985 --rc genhtml_branch_coverage=1 00:09:37.985 --rc genhtml_function_coverage=1 00:09:37.985 --rc genhtml_legend=1 00:09:37.985 --rc geninfo_all_blocks=1 00:09:37.985 --rc geninfo_unexecuted_blocks=1 00:09:37.985 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:37.985 ' 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:09:37.985 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:37.985 --rc genhtml_branch_coverage=1 00:09:37.985 --rc genhtml_function_coverage=1 00:09:37.985 --rc genhtml_legend=1 00:09:37.985 --rc geninfo_all_blocks=1 00:09:37.985 --rc geninfo_unexecuted_blocks=1 00:09:37.985 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:37.985 ' 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@64 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_MAX_NUMA_NODES=1 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_PGO_CAPTURE=n 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_LTO=n 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_ISCSI_INITIATOR=y 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_CET=n 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@24 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=n 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_OCF_PATH= 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_RDMA_SET_TOS=y 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_AIO_FSDEV=y 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_ARC4RANDOM=y 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_HAVE_LIBARCHIVE=n 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_UBLK=y 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_ISAL_CRYPTO=y 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OPENSSL_PATH= 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_OCF=n 00:09:37.985 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_FUSE=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_VTUNE_DIR= 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FUZZER=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_FSDEV=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_CRYPTO=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_PGO_USE=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_VHOST=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DAOS=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_DAOS_DIR= 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_UNIT_TESTS=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_VIRTIO=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_DPDK_UADK=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_COVERAGE=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_RDMA=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_HAVE_LZ4=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_URING_PATH= 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_XNVME=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_VFIO_USER=y 00:09:37.986 15:11:16 
llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_ARCH=native 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_HAVE_EVP_MAC=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_URING_ZNS=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_WERROR=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_HAVE_LIBBSD=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_UBSAN=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_IPSEC_MB_DIR= 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_GOLANG=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_ISAL=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_IDXD_KERNEL=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_RDMA_PROV=verbs 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_APPS=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_SHARED=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_HAVE_KEYUTILS=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_FC_PATH= 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_DPDK_PKG_CONFIG=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_FC=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_AVAHI=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_FIO_PLUGIN=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_RAID5F=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_EXAMPLES=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_TESTS=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_CRYPTO_MLX5=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_MAX_LCORES=128 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_IPSEC_MB=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_PGO_DIR= 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DEBUG=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_DPDK_COMPRESSDEV=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_CROSS_PREFIX= 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_COPY_FILE_RANGE=y 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@90 -- # CONFIG_URING=n 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:09:37.986 15:11:16 
llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:09:37.986 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:09:37.986 #define SPDK_CONFIG_H 00:09:37.986 #define SPDK_CONFIG_AIO_FSDEV 1 00:09:37.986 #define SPDK_CONFIG_APPS 1 00:09:37.986 #define SPDK_CONFIG_ARCH native 00:09:37.986 #undef SPDK_CONFIG_ASAN 00:09:37.986 #undef SPDK_CONFIG_AVAHI 00:09:37.986 #undef SPDK_CONFIG_CET 00:09:37.986 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:09:37.986 #define SPDK_CONFIG_COVERAGE 1 00:09:37.986 #define SPDK_CONFIG_CROSS_PREFIX 00:09:37.986 #undef SPDK_CONFIG_CRYPTO 00:09:37.986 #undef SPDK_CONFIG_CRYPTO_MLX5 00:09:37.986 #undef SPDK_CONFIG_CUSTOMOCF 00:09:37.986 #undef SPDK_CONFIG_DAOS 00:09:37.986 #define SPDK_CONFIG_DAOS_DIR 00:09:37.986 #define SPDK_CONFIG_DEBUG 1 00:09:37.986 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:09:37.986 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:37.986 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:09:37.986 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:37.986 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:09:37.986 #undef SPDK_CONFIG_DPDK_UADK 00:09:37.986 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:09:37.986 #define SPDK_CONFIG_EXAMPLES 1 00:09:37.986 #undef SPDK_CONFIG_FC 00:09:37.986 #define SPDK_CONFIG_FC_PATH 00:09:37.986 #define SPDK_CONFIG_FIO_PLUGIN 1 00:09:37.986 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:09:37.986 #define SPDK_CONFIG_FSDEV 1 00:09:37.986 #undef 
SPDK_CONFIG_FUSE 00:09:37.986 #define SPDK_CONFIG_FUZZER 1 00:09:37.986 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:09:37.986 #undef SPDK_CONFIG_GOLANG 00:09:37.986 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:09:37.986 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:09:37.986 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:09:37.986 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:09:37.986 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:09:37.986 #undef SPDK_CONFIG_HAVE_LIBBSD 00:09:37.986 #undef SPDK_CONFIG_HAVE_LZ4 00:09:37.986 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:09:37.986 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:09:37.986 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:09:37.986 #define SPDK_CONFIG_IDXD 1 00:09:37.986 #define SPDK_CONFIG_IDXD_KERNEL 1 00:09:37.986 #undef SPDK_CONFIG_IPSEC_MB 00:09:37.986 #define SPDK_CONFIG_IPSEC_MB_DIR 00:09:37.986 #define SPDK_CONFIG_ISAL 1 00:09:37.986 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:09:37.986 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:09:37.987 #define SPDK_CONFIG_LIBDIR 00:09:37.987 #undef SPDK_CONFIG_LTO 00:09:37.987 #define SPDK_CONFIG_MAX_LCORES 128 00:09:37.987 #define SPDK_CONFIG_MAX_NUMA_NODES 1 00:09:37.987 #define SPDK_CONFIG_NVME_CUSE 1 00:09:37.987 #undef SPDK_CONFIG_OCF 00:09:37.987 #define SPDK_CONFIG_OCF_PATH 00:09:37.987 #define SPDK_CONFIG_OPENSSL_PATH 00:09:37.987 #undef SPDK_CONFIG_PGO_CAPTURE 00:09:37.987 #define SPDK_CONFIG_PGO_DIR 00:09:37.987 #undef SPDK_CONFIG_PGO_USE 00:09:37.987 #define SPDK_CONFIG_PREFIX /usr/local 00:09:37.987 #undef SPDK_CONFIG_RAID5F 00:09:37.987 #undef SPDK_CONFIG_RBD 00:09:37.987 #define SPDK_CONFIG_RDMA 1 00:09:37.987 #define SPDK_CONFIG_RDMA_PROV verbs 00:09:37.987 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:09:37.987 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:09:37.987 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:09:37.987 #undef SPDK_CONFIG_SHARED 00:09:37.987 #undef SPDK_CONFIG_SMA 00:09:37.987 #define SPDK_CONFIG_TESTS 1 00:09:37.987 #undef SPDK_CONFIG_TSAN 00:09:37.987 #define SPDK_CONFIG_UBLK 1 00:09:37.987 #define SPDK_CONFIG_UBSAN 1 00:09:37.987 #undef SPDK_CONFIG_UNIT_TESTS 00:09:37.987 #undef SPDK_CONFIG_URING 00:09:37.987 #define SPDK_CONFIG_URING_PATH 00:09:37.987 #undef SPDK_CONFIG_URING_ZNS 00:09:37.987 #undef SPDK_CONFIG_USDT 00:09:37.987 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:09:37.987 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:09:37.987 #define SPDK_CONFIG_VFIO_USER 1 00:09:37.987 #define SPDK_CONFIG_VFIO_USER_DIR 00:09:37.987 #define SPDK_CONFIG_VHOST 1 00:09:37.987 #define SPDK_CONFIG_VIRTIO 1 00:09:37.987 #undef SPDK_CONFIG_VTUNE 00:09:37.987 #define SPDK_CONFIG_VTUNE_DIR 00:09:37.987 #define SPDK_CONFIG_WERROR 1 00:09:37.987 #define SPDK_CONFIG_WPDK_DIR 00:09:37.987 #undef SPDK_CONFIG_XNVME 00:09:37.987 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:09:37.987 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:09:37.987 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:37.987 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:09:37.987 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:09:37.987 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:37.987 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- 
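[editor's note] The backslash-heavy pattern closing the test above (*\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G*) is just how set -x renders a quoted glob: applications.sh reads the generated config.h and glob-matches it for the debug define. An equivalent sketch, using the header path from this log:

    # Sketch of the check traced above: slurp the generated C header and
    # glob-match it for a define; [[ ]] short-circuits if the file is absent.
    config_h=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h
    if [[ -e $config_h && $(<"$config_h") == *"#define SPDK_CONFIG_DEBUG"* ]]; then
        echo "debug-enabled SPDK build"
    fi
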
scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:37.987 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:37.987 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:37.987 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:37.987 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:09:37.987 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:37.987 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:09:37.987 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:09:37.987 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@67 -- # 
PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # uname -s 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:09:38.249 15:11:16 
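[editor's note] The PATH values exported by paths/export.sh a few entries above carry the same /opt/go, /opt/protoc and /opt/golangci segments several times: each time the file is sourced it prepends its directories unconditionally, so nested sourcing duplicates them (harmless, since lookup stops at the first hit). Purely as a sketch, a dedup-guarded variant (not the actual export.sh) would be:

    # Sketch (not the real export.sh): prepend a directory to PATH only
    # if it is not already present, avoiding the duplication seen above.
    prepend_path() {
        case ":$PATH:" in
            *":$1:"*) ;;              # already on PATH, do nothing
            *) PATH=$1:$PATH ;;
        esac
    }
    prepend_path /opt/go/1.21.1/bin   # directory taken from the log
    export PATH
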
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:09:38.249 
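[editor's note] The long run of paired ": 0" (or ": 1", ": rdma", ": v23.11") and "export SPDK_TEST_*" entries above is the shell default-assignment idiom: ":" is a no-op whose arguments are still expanded, so ${VAR=default} assigns only when the variable is unset, letting the Jenkins job override any flag from the environment. Sketch with one flag from the log:

    # Sketch of the idiom behind the ': 0' / 'export ...' pairs traced above.
    : "${SPDK_TEST_FUZZER=0}"   # assign 0 only if the caller left it unset;
                                # under xtrace this prints as ': 0'
    export SPDK_TEST_FUZZER     # publish the (possibly inherited) value
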
15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@140 -- # : v23.11 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:09:38.249 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@165 -- # export 
SPDK_TEST_ACCEL_DSA 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@177 -- # : 0 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@178 -- # export SPDK_TEST_NVME_INTERRUPT 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@183 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@183 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@184 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@184 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@187 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@187 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@191 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@191 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@195 -- # export PYTHONDONTWRITEBYTECODE=1 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@195 -- # PYTHONDONTWRITEBYTECODE=1 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@199 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@199 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@200 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@200 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@204 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@205 -- # rm -rf /var/tmp/asan_suppression_file 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@206 -- # cat 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@242 -- # echo leak:libfuse3.so 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@248 -- # '[' -z /var/spdk/dependencies ']' 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@251 -- # export DEPENDENCY_DIR 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@255 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@255 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@256 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@256 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@259 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@259 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@262 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@262 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV_MAIN=0 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV_LLVM=1 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@269 -- # _LCOV= 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ '' == *clang* ]] 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ 1 -eq 1 ]] 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # _LCOV=1 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@272 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@273 -- # _lcov_opt[_LCOV_MAIN]= 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@275 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@278 -- # '[' 0 -eq 0 ']' 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@279 -- # export valgrind= 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@279 -- # valgrind= 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # uname -s 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # '[' Linux = Linux ']' 00:09:38.250 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@286 -- # HUGEMEM=4096 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # export CLEAR_HUGE=yes 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # CLEAR_HUGE=yes 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@289 -- # MAKE=make 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@290 -- # MAKEFLAGS=-j72 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # export HUGEMEM=4096 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # HUGEMEM=4096 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@308 -- # NO_HUGE=() 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@309 -- # TEST_MODE= 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@331 -- # [[ -z 1481850 ]] 00:09:38.251 
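[editor's note] A few entries above, the harness rebuilds /var/tmp/asan_suppression_file, appends leak:libfuse3.so, and points LSAN_OPTIONS at it; that is LeakSanitizer's standard suppression mechanism for known benign leaks. Condensed sketch (paths and the suppressed symbol taken from the log):

    # Sketch of the leak-suppression setup traced above.
    supp=/var/tmp/asan_suppression_file
    rm -rf "$supp"
    echo 'leak:libfuse3.so' >> "$supp"      # ignore fuse3 shutdown leaks
    export LSAN_OPTIONS=suppressions=$supp
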
15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@331 -- # kill -0 1481850 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1678 -- # set_test_storage 2147483648 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@341 -- # [[ -v testdir ]] 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@343 -- # local requested_size=2147483648 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@344 -- # local mount target_dir 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@346 -- # local -A mounts fss sizes avails uses 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@347 -- # local source fs size avail mount use 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@349 -- # local storage_fallback storage_candidates 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@351 -- # mktemp -udt spdk.XXXXXX 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@351 -- # storage_fallback=/tmp/spdk.cCpJEc 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@356 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@358 -- # [[ -n '' ]] 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@363 -- # [[ -n '' ]] 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@368 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.cCpJEc/tests/vfio /tmp/spdk.cCpJEc 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # requested_size=2214592512 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@340 -- # df -T 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@340 -- # grep -v Filesystem 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_devtmpfs 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=devtmpfs 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=67108864 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=67108864 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=0 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/pmem0 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=ext2 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=4096 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=5284429824 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5284425728 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- 
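[editor's note] set_test_storage, entered above, first derives a fallback directory name with mktemp -udt spdk.XXXXXX: -u is a dry run (a name is generated but nothing is created), -d asks for a directory-style name, and -t roots the template in $TMPDIR. The candidate list is then the test's own directory, a per-test subdirectory of the fallback, and the fallback itself. Sketch (testdir path illustrative):

    # Sketch of the fallback naming traced above: mktemp -u only *names*
    # a path; the directories are created explicitly afterwards.
    testdir=/path/to/spdk/test/fuzz/llvm/vfio     # illustrative
    storage_fallback=$(mktemp -udt spdk.XXXXXX)   # e.g. /tmp/spdk.cCpJEc
    storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback")
    mkdir -p "$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback"
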
common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_root 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=overlay 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=84497670144 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=94500274176 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=10002604032 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=47245373440 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=47250137088 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=4763648 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=18893950976 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=18900058112 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=6107136 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=47249588224 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=47250137088 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=548864 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=9450012672 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=9450024960 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=12288 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@379 -- # printf '* Looking for test storage...\n' 00:09:38.251 * Looking for test storage... 
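[editor's note] With the df -T output read into the associative arrays above, the loop that follows walks storage_candidates and accepts the first directory whose filesystem offers the requested space (2214592512 bytes here: the 2 GiB ask plus margin). Reduced to its core as a sketch; the real set_test_storage also special-cases tmpfs/ramfs mounts:

    # Condensed sketch of the candidate walk (the real function can also
    # resize tmpfs/ramfs mounts instead of just checking free space).
    requested_size=2214592512                            # bytes, from the log
    for target_dir in "${storage_candidates[@]}"; do
        avail_kib=$(df --output=avail "$target_dir" | tail -1)
        (( avail_kib * 1024 >= requested_size )) && break
    done
    echo "* Found test storage at $target_dir"
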
00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@381 -- # local target_space new_size 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@382 -- # for target_dir in "${storage_candidates[@]}" 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # awk '$1 !~ /Filesystem/{print $6}' 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # mount=/ 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@387 -- # target_space=84497670144 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@388 -- # (( target_space == 0 || target_space < requested_size )) 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # (( target_space >= requested_size )) 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == tmpfs ]] 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == ramfs ]] 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ / == / ]] 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@394 -- # new_size=12217196544 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@395 -- # (( new_size * 100 / sizes[/] > 95 )) 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@401 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:38.251 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@402 -- # return 0 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1680 -- # set -o errtrace 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # shopt -s extdebug 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1682 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1684 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1685 -- # true 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1687 -- # xtrace_fd 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:09:38.251 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:09:38.252 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:38.252 --rc genhtml_branch_coverage=1 00:09:38.252 --rc genhtml_function_coverage=1 00:09:38.252 --rc genhtml_legend=1 00:09:38.252 --rc geninfo_all_blocks=1 00:09:38.252 --rc geninfo_unexecuted_blocks=1 00:09:38.252 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:38.252 ' 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:09:38.252 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:38.252 --rc genhtml_branch_coverage=1 00:09:38.252 --rc genhtml_function_coverage=1 00:09:38.252 --rc genhtml_legend=1 00:09:38.252 --rc geninfo_all_blocks=1 00:09:38.252 --rc geninfo_unexecuted_blocks=1 00:09:38.252 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:38.252 ' 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:09:38.252 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:38.252 --rc genhtml_branch_coverage=1 00:09:38.252 --rc genhtml_function_coverage=1 00:09:38.252 --rc genhtml_legend=1 00:09:38.252 --rc geninfo_all_blocks=1 00:09:38.252 --rc geninfo_unexecuted_blocks=1 00:09:38.252 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:38.252 ' 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:09:38.252 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:38.252 --rc genhtml_branch_coverage=1 00:09:38.252 --rc genhtml_function_coverage=1 00:09:38.252 --rc genhtml_legend=1 00:09:38.252 --rc geninfo_all_blocks=1 00:09:38.252 --rc geninfo_unexecuted_blocks=1 00:09:38.252 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:38.252 ' 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@65 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:09:38.252 15:11:16 
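[editor's note] The PS4 assignment a few entries above is what gives every traced line in this log its shape: with xtrace on, bash prints the expanded PS4 before each command, \t becomes the wall-clock time (PS4 undergoes prompt expansion in recent bash), test_domain supplies the llvm_fuzz.vfio_llvm_fuzz tag, the trimmed BASH_SOURCE plus LINENO yield the script@line locator, and \$ renders as '#' because the job runs as root. Sketch, assuming a bash new enough to prompt-expand PS4:

    # Sketch: reproduce the trace prefix used throughout this log.
    test_domain=llvm_fuzz.vfio_llvm_fuzz
    PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ '
    set -x       # each command now echoes as:
                 # 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- <file>@<line> -- # <cmd>
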
llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@67 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # fuzz_num=7 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@69 -- # (( fuzz_num != 0 )) 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@71 -- # trap 'cleanup /tmp/vfio-user-* /var/tmp/suppress_vfio_fuzz; exit 1' SIGINT SIGTERM EXIT 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@74 -- # mem_size=0 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@75 -- # [[ 1 -eq 1 ]] 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@76 -- # start_llvm_fuzz_short 7 1 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=7 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=0 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:09:38.252 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:38.252 15:11:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:09:38.511 [2024-11-20 15:11:16.941739] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:09:38.511 [2024-11-20 15:11:16.941823] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1482011 ] 00:09:38.511 [2024-11-20 15:11:17.037071] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:38.511 [2024-11-20 15:11:17.062903] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:38.768 INFO: Running with entropic power schedule (0xFF, 100). 00:09:38.768 INFO: Seed: 1125424295 00:09:38.768 INFO: Loaded 1 modules (385630 inline 8-bit counters): 385630 [0x2aa180c, 0x2affa6a), 00:09:38.768 INFO: Loaded 1 PC tables (385630 PCs): 385630 [0x2affa70,0x30e2050), 00:09:38.768 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:09:38.769 INFO: A corpus is not provided, starting from an empty corpus 00:09:38.769 #2 INITED exec/s: 0 rss: 68Mb 00:09:38.769 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:38.769 This may also happen if the target rejected all inputs we tried so far 00:09:38.769 [2024-11-20 15:11:17.308052] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: enabling controller 00:09:39.284 NEW_FUNC[1/672]: 0x459068 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:84 00:09:39.285 NEW_FUNC[2/672]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:39.285 #6 NEW cov: 11167 ft: 11084 corp: 2/7b lim: 6 exec/s: 0 rss: 74Mb L: 6/6 MS: 4 CopyPart-EraseBytes-ShuffleBytes-InsertRepeatedBytes- 00:09:39.285 #7 NEW cov: 11181 ft: 14037 corp: 3/13b lim: 6 exec/s: 0 rss: 75Mb L: 6/6 MS: 1 ChangeBinInt- 00:09:39.543 NEW_FUNC[1/1]: 0x1c290f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:39.543 #10 NEW cov: 11201 ft: 16094 corp: 4/19b lim: 6 exec/s: 0 rss: 76Mb L: 6/6 MS: 3 EraseBytes-ShuffleBytes-CopyPart- 00:09:39.801 #13 NEW cov: 11201 ft: 16879 corp: 5/25b lim: 6 exec/s: 13 rss: 76Mb L: 6/6 MS: 3 ShuffleBytes-InsertByte-InsertRepeatedBytes- 00:09:39.801 #14 NEW cov: 11201 ft: 17037 corp: 6/31b lim: 6 exec/s: 14 rss: 76Mb L: 6/6 MS: 1 CopyPart- 00:09:40.059 #15 NEW cov: 11201 ft: 17715 corp: 7/37b lim: 6 exec/s: 15 rss: 76Mb L: 6/6 MS: 1 ChangeBinInt- 00:09:40.317 #16 NEW cov: 11201 ft: 18081 corp: 8/43b lim: 6 exec/s: 16 rss: 76Mb L: 6/6 MS: 1 ChangeBinInt- 00:09:40.317 #17 NEW cov: 11201 ft: 18170 corp: 9/49b lim: 6 exec/s: 17 rss: 77Mb L: 6/6 MS: 1 ChangeBinInt- 00:09:40.576 #18 NEW cov: 11208 ft: 18335 corp: 10/55b lim: 6 exec/s: 18 rss: 77Mb L: 6/6 MS: 1 ChangeBinInt- 00:09:40.836 #19 NEW cov: 11208 ft: 18398 corp: 11/61b lim: 6 exec/s: 9 rss: 77Mb L: 6/6 MS: 1 CopyPart- 00:09:40.836 #19 DONE cov: 11208 ft: 18398 corp: 11/61b lim: 6 exec/s: 9 rss: 77Mb 00:09:40.836 Done 19 runs in 2 second(s) 00:09:40.836 [2024-11-20 15:11:19.331521] 
vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: disabling controller 00:09:41.095 15:11:19 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-0 /var/tmp/suppress_vfio_fuzz 00:09:41.095 15:11:19 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:41.095 15:11:19 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:41.095 15:11:19 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:09:41.095 15:11:19 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=1 00:09:41.095 15:11:19 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:41.095 15:11:19 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:41.095 15:11:19 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:41.095 15:11:19 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:09:41.095 15:11:19 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:09:41.095 15:11:19 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:09:41.095 15:11:19 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:09:41.095 15:11:19 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:41.095 15:11:19 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:41.095 15:11:19 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:41.095 15:11:19 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:09:41.095 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:41.095 15:11:19 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:41.095 15:11:19 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:41.095 15:11:19 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:09:41.095 [2024-11-20 15:11:19.616682] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:09:41.095 [2024-11-20 15:11:19.616762] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1482376 ] 00:09:41.095 [2024-11-20 15:11:19.713576] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:41.095 [2024-11-20 15:11:19.740065] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:41.355 INFO: Running with entropic power schedule (0xFF, 100). 
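[editor's note] The "Done 19 runs" line and controller teardown above close fuzzer 0; the driver then advances i and launches fuzzer 1, whose banner follows. The loop sizes itself from the fuzzer table in the C source, which is why grep -c '\.fn =' appears in the trace. Sketch of that driver, with start_llvm_fuzz being the harness function from vfio/run.sh:

    # Sketch of the short-run driver traced above: one libFuzzer pass per
    # '.fn =' entry in llvm_vfio_fuzz.c, each limited to 1 s on core 0x1.
    fuzzfile=test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c   # repo-relative
    fuzz_num=$(grep -c '\.fn =' "$fuzzfile")                 # 7 in this log
    for ((i = 0; i < fuzz_num; i++)); do
        start_llvm_fuzz "$i" 1 0x1   # args: fuzzer index, time (s), core mask
    done
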
00:09:41.355 INFO: Seed: 3806404302 00:09:41.355 INFO: Loaded 1 modules (385630 inline 8-bit counters): 385630 [0x2aa180c, 0x2affa6a), 00:09:41.355 INFO: Loaded 1 PC tables (385630 PCs): 385630 [0x2affa70,0x30e2050), 00:09:41.355 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:41.355 INFO: A corpus is not provided, starting from an empty corpus 00:09:41.355 #2 INITED exec/s: 0 rss: 68Mb 00:09:41.355 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:41.355 This may also happen if the target rejected all inputs we tried so far 00:09:41.355 [2024-11-20 15:11:19.987859] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: enabling controller 00:09:41.613 [2024-11-20 15:11:20.043364] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:41.613 [2024-11-20 15:11:20.043396] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:41.613 [2024-11-20 15:11:20.043416] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:41.870 NEW_FUNC[1/673]: 0x459608 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:71 00:09:41.870 NEW_FUNC[2/673]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:41.870 #16 NEW cov: 11153 ft: 11134 corp: 2/5b lim: 4 exec/s: 0 rss: 74Mb L: 4/4 MS: 4 ChangeByte-ShuffleBytes-InsertByte-CopyPart- 00:09:41.870 [2024-11-20 15:11:20.535293] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:41.870 [2024-11-20 15:11:20.535345] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:41.870 [2024-11-20 15:11:20.535365] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:42.127 NEW_FUNC[1/1]: 0x1fc00e8 in thread_execute_poller /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:957 00:09:42.127 #25 NEW cov: 11177 ft: 14363 corp: 3/9b lim: 4 exec/s: 0 rss: 75Mb L: 4/4 MS: 4 ChangeByte-ChangeByte-ChangeBit-CrossOver- 00:09:42.127 [2024-11-20 15:11:20.724623] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:42.127 [2024-11-20 15:11:20.724650] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:42.127 [2024-11-20 15:11:20.724669] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:42.385 NEW_FUNC[1/1]: 0x1c290f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:42.385 #29 NEW cov: 11194 ft: 15245 corp: 4/13b lim: 4 exec/s: 0 rss: 75Mb L: 4/4 MS: 4 InsertByte-CrossOver-ChangeBit-InsertByte- 00:09:42.385 [2024-11-20 15:11:20.900354] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:42.385 [2024-11-20 15:11:20.900389] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:42.385 [2024-11-20 15:11:20.900422] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:42.385 #30 NEW cov: 11197 ft: 15745 corp: 5/17b lim: 4 exec/s: 30 rss: 75Mb L: 4/4 MS: 1 ShuffleBytes- 00:09:42.385 [2024-11-20 15:11:21.068807] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:42.385 [2024-11-20 
15:11:21.068830] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:42.385 [2024-11-20 15:11:21.068848] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:42.642 #31 NEW cov: 11197 ft: 16047 corp: 6/21b lim: 4 exec/s: 31 rss: 75Mb L: 4/4 MS: 1 ShuffleBytes- 00:09:42.643 [2024-11-20 15:11:21.235870] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:42.643 [2024-11-20 15:11:21.235892] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:42.643 [2024-11-20 15:11:21.235910] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:42.900 #33 NEW cov: 11197 ft: 17077 corp: 7/25b lim: 4 exec/s: 33 rss: 76Mb L: 4/4 MS: 2 EraseBytes-InsertByte- 00:09:42.900 [2024-11-20 15:11:21.413865] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:42.900 [2024-11-20 15:11:21.413889] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:42.900 [2024-11-20 15:11:21.413907] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:42.900 #34 NEW cov: 11197 ft: 17148 corp: 8/29b lim: 4 exec/s: 34 rss: 76Mb L: 4/4 MS: 1 ChangeByte- 00:09:42.900 [2024-11-20 15:11:21.580992] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:42.900 [2024-11-20 15:11:21.581014] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:42.900 [2024-11-20 15:11:21.581031] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:43.157 #40 NEW cov: 11197 ft: 17267 corp: 9/33b lim: 4 exec/s: 40 rss: 76Mb L: 4/4 MS: 1 ShuffleBytes- 00:09:43.157 [2024-11-20 15:11:21.750959] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:43.157 [2024-11-20 15:11:21.750982] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:43.157 [2024-11-20 15:11:21.750999] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:43.415 #41 NEW cov: 11204 ft: 17688 corp: 10/37b lim: 4 exec/s: 41 rss: 76Mb L: 4/4 MS: 1 ShuffleBytes- 00:09:43.415 [2024-11-20 15:11:21.916022] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:43.415 [2024-11-20 15:11:21.916045] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:43.415 [2024-11-20 15:11:21.916062] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:43.415 #48 NEW cov: 11204 ft: 17858 corp: 11/41b lim: 4 exec/s: 24 rss: 76Mb L: 4/4 MS: 2 InsertByte-CrossOver- 00:09:43.415 #48 DONE cov: 11204 ft: 17858 corp: 11/41b lim: 4 exec/s: 24 rss: 76Mb 00:09:43.415 Done 48 runs in 2 second(s) 00:09:43.415 [2024-11-20 15:11:22.032522] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: disabling controller 00:09:43.673 15:11:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-1 /var/tmp/suppress_vfio_fuzz 00:09:43.673 15:11:22 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:43.673 15:11:22 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:43.673 15:11:22 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:09:43.673 15:11:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=2 00:09:43.673 
15:11:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:43.673 15:11:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:43.673 15:11:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:43.673 15:11:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:09:43.673 15:11:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:09:43.673 15:11:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:09:43.673 15:11:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:09:43.673 15:11:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:43.673 15:11:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:43.673 15:11:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:43.673 15:11:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:09:43.673 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:43.673 15:11:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:43.673 15:11:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:43.673 15:11:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:09:43.673 [2024-11-20 15:11:22.318864] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:09:43.673 [2024-11-20 15:11:22.318945] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1482732 ] 00:09:43.931 [2024-11-20 15:11:22.415986] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:43.931 [2024-11-20 15:11:22.442132] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:44.188 INFO: Running with entropic power schedule (0xFF, 100). 00:09:44.188 INFO: Seed: 2209427917 00:09:44.188 INFO: Loaded 1 modules (385630 inline 8-bit counters): 385630 [0x2aa180c, 0x2affa6a), 00:09:44.188 INFO: Loaded 1 PC tables (385630 PCs): 385630 [0x2affa70,0x30e2050), 00:09:44.188 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:44.188 INFO: A corpus is not provided, starting from an empty corpus 00:09:44.188 #2 INITED exec/s: 0 rss: 67Mb 00:09:44.188 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:44.188 This may also happen if the target rejected all inputs we tried so far 00:09:44.188 [2024-11-20 15:11:22.687274] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: enabling controller 00:09:44.188 [2024-11-20 15:11:22.736771] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:44.754 NEW_FUNC[1/673]: 0x459ff8 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:103 00:09:44.754 NEW_FUNC[2/673]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:44.754 #4 NEW cov: 11149 ft: 11115 corp: 2/9b lim: 8 exec/s: 0 rss: 73Mb L: 8/8 MS: 2 InsertRepeatedBytes-InsertByte- 00:09:44.754 [2024-11-20 15:11:23.224532] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:44.754 #10 NEW cov: 11163 ft: 14154 corp: 3/17b lim: 8 exec/s: 0 rss: 74Mb L: 8/8 MS: 1 CopyPart- 00:09:44.754 [2024-11-20 15:11:23.400190] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:45.013 NEW_FUNC[1/1]: 0x1c290f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:45.013 #11 NEW cov: 11180 ft: 15486 corp: 4/25b lim: 8 exec/s: 0 rss: 75Mb L: 8/8 MS: 1 ChangeByte- 00:09:45.013 [2024-11-20 15:11:23.590218] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:45.013 #17 NEW cov: 11180 ft: 15996 corp: 5/33b lim: 8 exec/s: 17 rss: 75Mb L: 8/8 MS: 1 CopyPart- 00:09:45.271 [2024-11-20 15:11:23.769571] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:45.271 #23 NEW cov: 11180 ft: 16491 corp: 6/41b lim: 8 exec/s: 23 rss: 75Mb L: 8/8 MS: 1 ChangeBinInt- 00:09:45.271 [2024-11-20 15:11:23.945950] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:45.529 #26 NEW cov: 11180 ft: 16810 corp: 7/49b lim: 8 exec/s: 26 rss: 76Mb L: 8/8 MS: 3 CrossOver-CopyPart-InsertByte- 00:09:45.529 [2024-11-20 15:11:24.125051] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:45.787 #27 NEW cov: 11180 ft: 17137 corp: 8/57b lim: 8 exec/s: 27 rss: 76Mb L: 8/8 MS: 1 ChangeByte- 00:09:45.787 [2024-11-20 15:11:24.303570] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:45.787 #28 NEW cov: 11180 ft: 17170 corp: 9/65b lim: 8 exec/s: 28 rss: 76Mb L: 8/8 MS: 1 CrossOver- 00:09:46.044 [2024-11-20 15:11:24.483007] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:46.044 #29 NEW cov: 11187 ft: 17902 corp: 10/73b lim: 8 exec/s: 29 rss: 76Mb L: 8/8 MS: 1 ChangeByte- 00:09:46.044 [2024-11-20 15:11:24.675756] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:46.302 #30 NEW cov: 11187 ft: 18199 corp: 11/81b lim: 8 exec/s: 15 rss: 76Mb L: 8/8 MS: 1 ChangeBit- 00:09:46.302 #30 DONE cov: 11187 ft: 18199 corp: 11/81b lim: 8 exec/s: 15 rss: 76Mb 00:09:46.302 Done 30 runs in 2 second(s) 00:09:46.302 [2024-11-20 15:11:24.811523] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: disabling controller 00:09:46.560 15:11:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-2 /var/tmp/suppress_vfio_fuzz 00:09:46.560 15:11:25 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- 
# (( i++ )) 00:09:46.560 15:11:25 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:46.560 15:11:25 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:09:46.560 15:11:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=3 00:09:46.560 15:11:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:46.560 15:11:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:46.560 15:11:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:46.560 15:11:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:09:46.560 15:11:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:09:46.560 15:11:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:09:46.560 15:11:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:09:46.560 15:11:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:46.560 15:11:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:46.561 15:11:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:46.561 15:11:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:09:46.561 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:46.561 15:11:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:46.561 15:11:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:46.561 15:11:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:09:46.561 [2024-11-20 15:11:25.098520] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:09:46.561 [2024-11-20 15:11:25.098598] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1483136 ] 00:09:46.561 [2024-11-20 15:11:25.192390] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:46.561 [2024-11-20 15:11:25.218202] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:46.818 INFO: Running with entropic power schedule (0xFF, 100). 
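The two echo lines in the trace above (run.sh@43 and @44) build LeakSanitizer's suppression list, and the LSAN_OPTIONS declaration wires it into the harness. A short sketch of the mechanism, again assuming the echoes are redirected into $suppress_file (xtrace hides redirections, so the target is inferred from the variable name; run.sh also keeps LSAN_OPTIONS function-local rather than exporting it as done here):

  suppress_file=/var/tmp/suppress_vfio_fuzz
  echo leak:spdk_nvmf_qpair_disconnect  > "$suppress_file"
  echo leak:nvmf_ctrlr_create          >> "$suppress_file"

  # report_objects=1     list the leaked objects themselves in any report
  # suppressions=FILE    ignore leaks whose stacks match FILE's leak: patterns
  # print_suppressions=0 skip the matched-suppression summary on exit
  export LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0
  # Any sanitizer-instrumented binary launched from here on honors the list.

Each leak: pattern matches a function name in the leak's allocation stack, so these two entries silence only the qpair-disconnect and controller-create allocations known to outlive a fuzz run, without hiding new leaks elsewhere.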
00:09:46.818 INFO: Seed: 692453777 00:09:46.818 INFO: Loaded 1 modules (385630 inline 8-bit counters): 385630 [0x2aa180c, 0x2affa6a), 00:09:46.818 INFO: Loaded 1 PC tables (385630 PCs): 385630 [0x2affa70,0x30e2050), 00:09:46.818 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:46.818 INFO: A corpus is not provided, starting from an empty corpus 00:09:46.818 #2 INITED exec/s: 0 rss: 67Mb 00:09:46.818 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:46.818 This may also happen if the target rejected all inputs we tried so far 00:09:46.818 [2024-11-20 15:11:25.465140] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: enabling controller 00:09:47.333 NEW_FUNC[1/673]: 0x45a6e8 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:124 00:09:47.333 NEW_FUNC[2/673]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:47.333 #221 NEW cov: 11137 ft: 11125 corp: 2/33b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 4 InsertByte-CrossOver-InsertRepeatedBytes-InsertRepeatedBytes- 00:09:47.591 #227 NEW cov: 11168 ft: 13378 corp: 3/65b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 ChangeBit- 00:09:47.849 NEW_FUNC[1/1]: 0x1c290f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:47.849 #238 NEW cov: 11185 ft: 13682 corp: 4/97b lim: 32 exec/s: 0 rss: 75Mb L: 32/32 MS: 1 CopyPart- 00:09:47.849 #239 NEW cov: 11185 ft: 15372 corp: 5/129b lim: 32 exec/s: 239 rss: 75Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:48.107 #245 NEW cov: 11185 ft: 15508 corp: 6/161b lim: 32 exec/s: 245 rss: 75Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:48.364 #246 NEW cov: 11185 ft: 15980 corp: 7/193b lim: 32 exec/s: 246 rss: 75Mb L: 32/32 MS: 1 CMP- DE: "\015A\243\004\000\000\000\000"- 00:09:48.364 #247 NEW cov: 11185 ft: 16627 corp: 8/225b lim: 32 exec/s: 247 rss: 75Mb L: 32/32 MS: 1 CopyPart- 00:09:48.621 #248 NEW cov: 11185 ft: 16658 corp: 9/257b lim: 32 exec/s: 248 rss: 75Mb L: 32/32 MS: 1 ChangeBit- 00:09:48.879 #249 NEW cov: 11192 ft: 16752 corp: 10/289b lim: 32 exec/s: 249 rss: 76Mb L: 32/32 MS: 1 CopyPart- 00:09:48.879 #250 NEW cov: 11192 ft: 16793 corp: 11/321b lim: 32 exec/s: 125 rss: 76Mb L: 32/32 MS: 1 ChangeByte- 00:09:48.879 #250 DONE cov: 11192 ft: 16793 corp: 11/321b lim: 32 exec/s: 125 rss: 76Mb 00:09:48.879 ###### Recommended dictionary. ###### 00:09:48.879 "\015A\243\004\000\000\000\000" # Uses: 0 00:09:48.879 ###### End of recommended dictionary. 
###### 00:09:48.879 Done 250 runs in 2 second(s) 00:09:49.138 [2024-11-20 15:11:27.575527] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: disabling controller 00:09:49.138 15:11:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-3 /var/tmp/suppress_vfio_fuzz 00:09:49.138 15:11:27 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:49.138 15:11:27 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:49.138 15:11:27 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:09:49.138 15:11:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=4 00:09:49.138 15:11:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:49.138 15:11:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:49.138 15:11:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:49.138 15:11:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:09:49.138 15:11:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:09:49.138 15:11:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:09:49.138 15:11:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:09:49.138 15:11:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:49.138 15:11:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:49.138 15:11:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:49.397 15:11:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:09:49.398 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:49.398 15:11:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:49.398 15:11:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:49.398 15:11:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:09:49.398 [2024-11-20 15:11:27.860473] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 
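The long command just launched above is easier to read against the variables run.sh declared a few lines earlier. The annotated reconstruction below maps each flag to the variable it appears to come from; the mappings are read off the trace rather than taken from documented option help, so treat them as a best-effort interpretation:

  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  # Flag meanings inferred from run.sh's variable names, not documented
  # semantics.
  args=(
      -m 0x1                                    # core mask (the loop's $core)
      -s 0                                      # SPDK app memory size
      -P "$SPDK/../output/llvm/"                # artifact/output prefix
      -F /tmp/vfio-user-4/domain/1              # $vfiouser_dir
      -c /tmp/vfio-user-4/fuzz_vfio_json.conf   # $vfiouser_cfg
      -t 1                                      # $timen: per-run time budget
      -D "$SPDK/../corpus/llvm_vfio_4"          # $corpus_dir (persists across builds)
      -Y /tmp/vfio-user-4/domain/2              # $vfiouser_io_dir
      -r /tmp/vfio-user-4/spdk4.sock            # per-instance RPC socket
      -Z 4                                      # $fuzzer_type: target selector
  )
  "$SPDK/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz" "${args[@]}"

-Z is what actually varies between the seven iterations, and the NEW_FUNC lines confirm what it selects: -Z 4 lands in fuzz_vfio_user_dma_unmap below, just as -Z 1 landed in fuzz_vfio_user_version and -Z 3 in fuzz_vfio_user_dma_map earlier in this log.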
00:09:49.398 [2024-11-20 15:11:27.860572] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1483504 ] 00:09:49.398 [2024-11-20 15:11:27.956379] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:49.398 [2024-11-20 15:11:27.981337] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:49.656 INFO: Running with entropic power schedule (0xFF, 100). 00:09:49.656 INFO: Seed: 3445469994 00:09:49.656 INFO: Loaded 1 modules (385630 inline 8-bit counters): 385630 [0x2aa180c, 0x2affa6a), 00:09:49.656 INFO: Loaded 1 PC tables (385630 PCs): 385630 [0x2affa70,0x30e2050), 00:09:49.656 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:49.656 INFO: A corpus is not provided, starting from an empty corpus 00:09:49.656 #2 INITED exec/s: 0 rss: 68Mb 00:09:49.656 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:49.656 This may also happen if the target rejected all inputs we tried so far 00:09:49.656 [2024-11-20 15:11:28.217151] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: enabling controller 00:09:50.171 NEW_FUNC[1/673]: 0x45af68 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:144 00:09:50.171 NEW_FUNC[2/673]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:50.171 #22 NEW cov: 11159 ft: 11096 corp: 2/33b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 5 ShuffleBytes-InsertRepeatedBytes-CrossOver-CopyPart-CopyPart- 00:09:50.171 #33 NEW cov: 11173 ft: 13449 corp: 3/65b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 ChangeByte- 00:09:50.428 NEW_FUNC[1/1]: 0x1c290f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:50.428 #44 NEW cov: 11190 ft: 14413 corp: 4/97b lim: 32 exec/s: 0 rss: 75Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:50.684 #50 NEW cov: 11190 ft: 15231 corp: 5/129b lim: 32 exec/s: 0 rss: 75Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:50.684 #56 NEW cov: 11190 ft: 15789 corp: 6/161b lim: 32 exec/s: 56 rss: 75Mb L: 32/32 MS: 1 ChangeByte- 00:09:50.941 #57 NEW cov: 11190 ft: 15919 corp: 7/193b lim: 32 exec/s: 57 rss: 75Mb L: 32/32 MS: 1 ChangeBit- 00:09:51.199 #58 NEW cov: 11190 ft: 16535 corp: 8/225b lim: 32 exec/s: 58 rss: 75Mb L: 32/32 MS: 1 CopyPart- 00:09:51.199 #59 NEW cov: 11190 ft: 16562 corp: 9/257b lim: 32 exec/s: 59 rss: 75Mb L: 32/32 MS: 1 ChangeByte- 00:09:51.457 #60 NEW cov: 11197 ft: 16598 corp: 10/289b lim: 32 exec/s: 60 rss: 76Mb L: 32/32 MS: 1 CrossOver- 00:09:51.714 #61 NEW cov: 11197 ft: 17277 corp: 11/321b lim: 32 exec/s: 30 rss: 76Mb L: 32/32 MS: 1 CopyPart- 00:09:51.714 #61 DONE cov: 11197 ft: 17277 corp: 11/321b lim: 32 exec/s: 30 rss: 76Mb 00:09:51.714 Done 61 runs in 2 second(s) 00:09:51.714 [2024-11-20 15:11:30.217539] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: disabling controller 00:09:51.973 15:11:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-4 /var/tmp/suppress_vfio_fuzz 00:09:51.973 15:11:30 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:51.973 15:11:30 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:51.973 15:11:30 llvm_fuzz.vfio_llvm_fuzz -- 
../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:09:51.973 15:11:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=5 00:09:51.973 15:11:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:51.973 15:11:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:51.973 15:11:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:51.973 15:11:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:09:51.973 15:11:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:09:51.973 15:11:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:09:51.973 15:11:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:09:51.973 15:11:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:51.974 15:11:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:51.974 15:11:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:51.974 15:11:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:09:51.974 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:51.974 15:11:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:51.974 15:11:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:51.974 15:11:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:09:51.974 [2024-11-20 15:11:30.506304] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:09:51.974 [2024-11-20 15:11:30.506389] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1483860 ] 00:09:51.974 [2024-11-20 15:11:30.602595] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:51.974 [2024-11-20 15:11:30.627507] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:52.233 INFO: Running with entropic power schedule (0xFF, 100). 
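The (( i++ )) / (( i < fuzz_num )) pair and the start_llvm_fuzz 5 1 0x1 call at the top of the trace above are the visible edges of the driver loop in test/fuzz/llvm/common.sh. A plausible reconstruction from the xtrace alone; the loop shape and the fuzz_num value are assumptions (targets 0 through 6 appear in this log):

  fuzz_num=7   # assumed from the vfio-user-0 .. vfio-user-6 runs shown here
  timen=1      # time budget forwarded to every run
  core=0x1     # core mask forwarded to every run
  for (( i = 0; i < fuzz_num; i++ )); do
      start_llvm_fuzz "$i" "$timen" "$core"
  done

Each iteration ends by removing its own /tmp/vfio-user-$i tree and the shared suppression file (the rm -rf at run.sh@58 above), so every target starts from a clean slate.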
00:09:52.233 INFO: Seed: 1792516007 00:09:52.233 INFO: Loaded 1 modules (385630 inline 8-bit counters): 385630 [0x2aa180c, 0x2affa6a), 00:09:52.233 INFO: Loaded 1 PC tables (385630 PCs): 385630 [0x2affa70,0x30e2050), 00:09:52.233 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:52.233 INFO: A corpus is not provided, starting from an empty corpus 00:09:52.233 #2 INITED exec/s: 0 rss: 68Mb 00:09:52.233 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:52.233 This may also happen if the target rejected all inputs we tried so far 00:09:52.233 [2024-11-20 15:11:30.859301] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: enabling controller 00:09:52.233 [2024-11-20 15:11:30.912349] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:52.233 [2024-11-20 15:11:30.912383] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:52.750 NEW_FUNC[1/674]: 0x45b968 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:171 00:09:52.750 NEW_FUNC[2/674]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:52.750 #11 NEW cov: 11163 ft: 11135 corp: 2/14b lim: 13 exec/s: 0 rss: 74Mb L: 13/13 MS: 4 ChangeByte-InsertRepeatedBytes-ChangeBinInt-CopyPart- 00:09:52.750 [2024-11-20 15:11:31.402704] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:52.750 [2024-11-20 15:11:31.402753] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:53.010 #12 NEW cov: 11181 ft: 14784 corp: 3/27b lim: 13 exec/s: 0 rss: 75Mb L: 13/13 MS: 1 CrossOver- 00:09:53.010 [2024-11-20 15:11:31.595064] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:53.010 [2024-11-20 15:11:31.595099] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:53.268 NEW_FUNC[1/1]: 0x1c290f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:53.268 #13 NEW cov: 11198 ft: 15944 corp: 4/40b lim: 13 exec/s: 0 rss: 76Mb L: 13/13 MS: 1 CrossOver- 00:09:53.268 [2024-11-20 15:11:31.787804] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:53.269 [2024-11-20 15:11:31.787837] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:53.269 #14 NEW cov: 11198 ft: 16383 corp: 5/53b lim: 13 exec/s: 14 rss: 76Mb L: 13/13 MS: 1 CopyPart- 00:09:53.527 [2024-11-20 15:11:31.978433] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:53.527 [2024-11-20 15:11:31.978463] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:53.527 #19 NEW cov: 11198 ft: 16647 corp: 6/66b lim: 13 exec/s: 19 rss: 76Mb L: 13/13 MS: 5 ChangeBinInt-CopyPart-ChangeByte-CrossOver-InsertRepeatedBytes- 00:09:53.527 [2024-11-20 15:11:32.175083] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:53.527 [2024-11-20 15:11:32.175113] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:53.786 #20 NEW cov: 11198 ft: 16834 corp: 7/79b lim: 13 exec/s: 20 rss: 76Mb L: 13/13 MS: 1 ShuffleBytes- 00:09:53.786 [2024-11-20 15:11:32.359406] 
vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:53.786 [2024-11-20 15:11:32.359436] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:54.045 #26 NEW cov: 11198 ft: 17303 corp: 8/92b lim: 13 exec/s: 26 rss: 76Mb L: 13/13 MS: 1 ChangeByte- 00:09:54.045 [2024-11-20 15:11:32.553625] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:54.045 [2024-11-20 15:11:32.553655] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:54.045 #27 NEW cov: 11205 ft: 17332 corp: 9/105b lim: 13 exec/s: 27 rss: 76Mb L: 13/13 MS: 1 CopyPart- 00:09:54.304 [2024-11-20 15:11:32.742185] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:54.304 [2024-11-20 15:11:32.742216] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:54.304 #31 NEW cov: 11205 ft: 17436 corp: 10/118b lim: 13 exec/s: 15 rss: 76Mb L: 13/13 MS: 4 ChangeBinInt-InsertRepeatedBytes-CrossOver-CopyPart- 00:09:54.304 #31 DONE cov: 11205 ft: 17436 corp: 10/118b lim: 13 exec/s: 15 rss: 76Mb 00:09:54.304 Done 31 runs in 2 second(s) 00:09:54.304 [2024-11-20 15:11:32.872531] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: disabling controller 00:09:54.563 15:11:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-5 /var/tmp/suppress_vfio_fuzz 00:09:54.563 15:11:33 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:54.563 15:11:33 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:54.563 15:11:33 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:09:54.563 15:11:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=6 00:09:54.563 15:11:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:54.563 15:11:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:54.563 15:11:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:54.563 15:11:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:09:54.563 15:11:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:09:54.563 15:11:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:09:54.563 15:11:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:09:54.563 15:11:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:54.563 15:11:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:54.563 15:11:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:54.563 15:11:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:09:54.563 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:54.563 15:11:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:54.563 
15:11:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:54.563 15:11:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:09:54.563 [2024-11-20 15:11:33.157194] Starting SPDK v25.01-pre git sha1 557f022f6 / DPDK 23.11.0 initialization... 00:09:54.563 [2024-11-20 15:11:33.157267] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1484204 ] 00:09:54.822 [2024-11-20 15:11:33.252469] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:54.822 [2024-11-20 15:11:33.276594] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:54.822 INFO: Running with entropic power schedule (0xFF, 100). 00:09:54.822 INFO: Seed: 151523593 00:09:54.822 INFO: Loaded 1 modules (385630 inline 8-bit counters): 385630 [0x2aa180c, 0x2affa6a), 00:09:54.822 INFO: Loaded 1 PC tables (385630 PCs): 385630 [0x2affa70,0x30e2050), 00:09:54.822 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:54.822 INFO: A corpus is not provided, starting from an empty corpus 00:09:54.822 #2 INITED exec/s: 0 rss: 67Mb 00:09:54.822 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:54.822 This may also happen if the target rejected all inputs we tried so far 00:09:55.080 [2024-11-20 15:11:33.514188] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: enabling controller 00:09:55.080 [2024-11-20 15:11:33.566335] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:55.080 [2024-11-20 15:11:33.566369] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:55.340 NEW_FUNC[1/671]: 0x45c658 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:09:55.340 NEW_FUNC[2/671]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:55.340 #29 NEW cov: 11115 ft: 11120 corp: 2/10b lim: 9 exec/s: 0 rss: 74Mb L: 9/9 MS: 2 ChangeBit-CMP- DE: "$\000\000\000\000\000\000\000"- 00:09:55.599 [2024-11-20 15:11:34.049734] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:55.599 [2024-11-20 15:11:34.049779] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:55.599 NEW_FUNC[1/3]: 0x15dc9f8 in map_one /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:734 00:09:55.599 NEW_FUNC[2/3]: 0x15e3268 in post_completion /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:1755 00:09:55.599 #35 NEW cov: 11174 ft: 14412 corp: 3/19b lim: 9 exec/s: 0 rss: 75Mb L: 9/9 MS: 1 CrossOver- 00:09:55.599 [2024-11-20 15:11:34.235605] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:55.599 [2024-11-20 15:11:34.235640] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:55.858 NEW_FUNC[1/1]: 0x1c290f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:55.858 #36 NEW cov: 11191 ft: 14588 corp: 4/28b lim: 9 exec/s: 0 rss: 75Mb L: 9/9 MS: 1 ChangeBinInt- 00:09:55.858 [2024-11-20 15:11:34.422258] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:55.858 [2024-11-20 15:11:34.422289] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:55.858 #38 NEW cov: 11191 ft: 15951 corp: 5/37b lim: 9 exec/s: 38 rss: 76Mb L: 9/9 MS: 2 EraseBytes-CopyPart- 00:09:56.116 [2024-11-20 15:11:34.607842] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:56.116 [2024-11-20 15:11:34.607873] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:56.116 #39 NEW cov: 11191 ft: 16373 corp: 6/46b lim: 9 exec/s: 39 rss: 76Mb L: 9/9 MS: 1 ChangeBinInt- 00:09:56.116 [2024-11-20 15:11:34.779539] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:56.116 [2024-11-20 15:11:34.779568] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:56.375 #40 NEW cov: 11191 ft: 17165 corp: 7/55b lim: 9 exec/s: 40 rss: 76Mb L: 9/9 MS: 1 ChangeByte- 00:09:56.375 [2024-11-20 15:11:34.954427] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:56.375 [2024-11-20 15:11:34.954458] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:56.375 #41 NEW cov: 11191 ft: 17325 corp: 8/64b lim: 9 exec/s: 41 rss: 76Mb L: 9/9 MS: 1 CrossOver- 00:09:56.634 [2024-11-20 
15:11:35.125835] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:56.634 [2024-11-20 15:11:35.125863] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:56.634 #42 NEW cov: 11191 ft: 17752 corp: 9/73b lim: 9 exec/s: 42 rss: 76Mb L: 9/9 MS: 1 CrossOver- 00:09:56.634 [2024-11-20 15:11:35.296417] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:56.634 [2024-11-20 15:11:35.296446] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:56.893 #43 NEW cov: 11198 ft: 17795 corp: 10/82b lim: 9 exec/s: 43 rss: 76Mb L: 9/9 MS: 1 ChangeByte- 00:09:56.893 [2024-11-20 15:11:35.467285] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:56.893 [2024-11-20 15:11:35.467317] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:56.893 #44 NEW cov: 11198 ft: 17815 corp: 11/91b lim: 9 exec/s: 22 rss: 76Mb L: 9/9 MS: 1 CopyPart- 00:09:56.893 #44 DONE cov: 11198 ft: 17815 corp: 11/91b lim: 9 exec/s: 22 rss: 76Mb 00:09:56.893 ###### Recommended dictionary. ###### 00:09:56.893 "$\000\000\000\000\000\000\000" # Uses: 0 00:09:56.893 ###### End of recommended dictionary. ###### 00:09:56.893 Done 44 runs in 2 second(s) 00:09:57.152 [2024-11-20 15:11:35.591526] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: disabling controller 00:09:57.152 15:11:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-6 /var/tmp/suppress_vfio_fuzz 00:09:57.152 15:11:35 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:57.152 15:11:35 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:57.152 15:11:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@84 -- # trap - SIGINT SIGTERM EXIT 00:09:57.152 00:09:57.152 real 0m19.383s 00:09:57.152 user 0m26.959s 00:09:57.152 sys 0m2.067s 00:09:57.152 15:11:35 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:57.152 15:11:35 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:57.152 ************************************ 00:09:57.152 END TEST vfio_llvm_fuzz 00:09:57.152 ************************************ 00:09:57.409 00:09:57.409 real 1m22.872s 00:09:57.409 user 2m6.452s 00:09:57.409 sys 0m9.784s 00:09:57.409 15:11:35 llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:57.409 15:11:35 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:57.409 ************************************ 00:09:57.409 END TEST llvm_fuzz 00:09:57.409 ************************************ 00:09:57.409 15:11:35 -- spdk/autotest.sh@378 -- # [[ '' -eq 1 ]] 00:09:57.409 15:11:35 -- spdk/autotest.sh@385 -- # trap - SIGINT SIGTERM EXIT 00:09:57.409 15:11:35 -- spdk/autotest.sh@387 -- # timing_enter post_cleanup 00:09:57.409 15:11:35 -- common/autotest_common.sh@726 -- # xtrace_disable 00:09:57.409 15:11:35 -- common/autotest_common.sh@10 -- # set +x 00:09:57.409 15:11:35 -- spdk/autotest.sh@388 -- # autotest_cleanup 00:09:57.409 15:11:35 -- common/autotest_common.sh@1396 -- # local autotest_es=0 00:09:57.409 15:11:35 -- common/autotest_common.sh@1397 -- # xtrace_disable 00:09:57.409 15:11:35 -- common/autotest_common.sh@10 -- # set +x 00:10:02.678 INFO: APP EXITING 00:10:02.678 INFO: killing all VMs 00:10:02.678 INFO: killing vhost app 00:10:02.678 INFO: EXIT DONE 00:10:05.206 Waiting for block devices as requested 00:10:05.206 0000:1a:00.0 (8086 
0a54): vfio-pci -> nvme 00:10:05.206 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:10:05.206 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:10:05.206 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:10:05.206 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:10:05.464 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:10:05.464 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:10:05.464 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:10:05.724 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:10:05.724 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:10:05.724 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:10:05.983 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:10:05.983 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:10:05.983 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:10:06.251 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:10:06.251 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:10:06.251 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:10:12.809 Cleaning 00:10:12.809 Removing: /dev/shm/spdk_tgt_trace.pid1462471 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1459973 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1461098 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1462471 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1462968 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1464089 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1464271 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1465023 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1465038 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1465378 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1465753 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1465934 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1466142 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1466338 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1466537 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1466737 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1466967 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1467554 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1469936 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1470237 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1470434 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1470479 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1470869 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1470968 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1471423 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1471426 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1471636 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1471649 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1471850 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1471864 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1472313 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1472510 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1472703 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1472943 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1473422 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1473713 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1474076 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1474432 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1474783 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1475142 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1475498 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1475809 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1476076 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1476417 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1476768 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1477122 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1477483 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1477836 00:10:12.809 
Removing: /var/run/dpdk/spdk_pid1478190 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1478471 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1478754 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1479109 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1479482 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1479837 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1480190 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1480536 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1480812 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1481107 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1481457 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1482011 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1482376 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1482732 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1483136 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1483504 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1483860 00:10:12.809 Removing: /var/run/dpdk/spdk_pid1484204 00:10:12.809 Clean 00:10:12.809 15:11:50 -- common/autotest_common.sh@1453 -- # return 0 00:10:12.809 15:11:50 -- spdk/autotest.sh@389 -- # timing_exit post_cleanup 00:10:12.809 15:11:50 -- common/autotest_common.sh@732 -- # xtrace_disable 00:10:12.809 15:11:50 -- common/autotest_common.sh@10 -- # set +x 00:10:12.809 15:11:50 -- spdk/autotest.sh@391 -- # timing_exit autotest 00:10:12.809 15:11:50 -- common/autotest_common.sh@732 -- # xtrace_disable 00:10:12.809 15:11:50 -- common/autotest_common.sh@10 -- # set +x 00:10:12.809 15:11:50 -- spdk/autotest.sh@392 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:10:12.809 15:11:50 -- spdk/autotest.sh@394 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:10:12.809 15:11:50 -- spdk/autotest.sh@394 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:10:12.809 15:11:50 -- spdk/autotest.sh@396 -- # [[ y == y ]] 00:10:12.809 15:11:50 -- spdk/autotest.sh@398 -- # hostname 00:10:12.809 15:11:50 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-39 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info 00:10:12.809 geninfo: WARNING: invalid characters removed from testname! 
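The lcov capture above is coverage collection for a clang-built tree: --gcov-tool points at test/fuzz/llvm/llvm-gcov.sh because plain gcov cannot read the .gcda files clang emits, and the two geninfo warnings (no data for nvme_stubs.gcda and mdns_server.gcda) are routine for stub sources that never executed. The records that follow merge the pre-test baseline with this capture and strip third-party paths. A condensed sketch of the pipeline is below; the wrapper body is an assumption about llvm-gcov.sh's contents, OUT is a stand-in for the output directory, and the genhtml --rc switches plus the --ignore-errors guard on the /usr/* pass are elided:

  # Assumed contents of test/fuzz/llvm/llvm-gcov.sh (not shown in the log):
  #   #!/usr/bin/env bash
  #   exec llvm-cov gcov "$@"

  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  OUT=$SPDK/../output
  RC="--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1"
  GCOV="--gcov-tool $SPDK/test/fuzz/llvm/llvm-gcov.sh"

  # Capture what the tests exercised, tagged with the host name.
  lcov $RC $GCOV -q -c --no-external -d "$SPDK" -t "$(hostname)" \
      -o "$OUT/cov_test.info"

  # Add the pre-test baseline, then peel away code we don't own or report on.
  lcov $RC $GCOV -q -a "$OUT/cov_base.info" -a "$OUT/cov_test.info" \
      -o "$OUT/cov_total.info"
  for pat in '*/dpdk/*' '/usr/*' '*/examples/vmd/*' \
             '*/app/spdk_lspci/*' '*/app/spdk_top/*'; do
      lcov $RC $GCOV -q -r "$OUT/cov_total.info" "$pat" -o "$OUT/cov_total.info"
  done

Filtering in place with -r reading and writing the same cov_total.info, as the trace below does, keeps only one totals file around between passes.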
00:10:16.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcda 00:10:21.372 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcda 00:10:23.277 15:12:01 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:31.421 15:12:09 -- spdk/autotest.sh@400 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:36.685 15:12:15 -- spdk/autotest.sh@404 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:43.243 15:12:20 -- spdk/autotest.sh@405 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:47.425 15:12:26 -- spdk/autotest.sh@406 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:53.978 15:12:31 -- spdk/autotest.sh@407 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:59.244 15:12:36 -- spdk/autotest.sh@408 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:10:59.244 15:12:36 -- spdk/autorun.sh@1 -- $ timing_finish 00:10:59.244 15:12:36 -- common/autotest_common.sh@738 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt ]] 00:10:59.244 15:12:36 -- common/autotest_common.sh@740 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:10:59.244 15:12:36 -- common/autotest_common.sh@741 -- $ [[ -x /usr/local/FlameGraph/flamegraph.pl ]] 00:10:59.244 15:12:36 -- common/autotest_common.sh@744 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:10:59.244 + [[ -n 1334433 ]] 00:10:59.244 + sudo kill 1334433 00:10:59.252 [Pipeline] } 00:10:59.267 [Pipeline] // stage 00:10:59.272 [Pipeline] } 00:10:59.289 [Pipeline] // timeout 00:10:59.294 [Pipeline] } 00:10:59.309 [Pipeline] // catchError 00:10:59.315 [Pipeline] } 00:10:59.331 [Pipeline] // wrap 00:10:59.337 [Pipeline] } 00:10:59.351 [Pipeline] // catchError 00:10:59.360 [Pipeline] stage 00:10:59.363 [Pipeline] { (Epilogue) 00:10:59.376 [Pipeline] catchError 00:10:59.378 [Pipeline] { 00:10:59.391 [Pipeline] echo 00:10:59.393 Cleanup processes 00:10:59.399 [Pipeline] sh 00:10:59.683 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:59.683 1491843 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:59.695 [Pipeline] sh 00:10:59.975 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:59.976 ++ grep -v 'sudo pgrep' 00:10:59.976 ++ awk '{print $1}' 00:10:59.976 + sudo kill -9 00:10:59.976 + true 00:10:59.987 [Pipeline] sh 00:11:00.267 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:11:12.473 [Pipeline] sh 00:11:12.756 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:11:12.756 Artifacts sizes are good 00:11:12.771 [Pipeline] archiveArtifacts 00:11:12.779 Archiving artifacts 00:11:12.930 [Pipeline] sh 00:11:13.215 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest 00:11:13.231 [Pipeline] cleanWs 00:11:13.241 [WS-CLEANUP] Deleting project workspace... 00:11:13.241 [WS-CLEANUP] Deferred wipeout is used... 00:11:13.247 [WS-CLEANUP] done 00:11:13.249 [Pipeline] } 00:11:13.268 [Pipeline] // catchError 00:11:13.281 [Pipeline] sh 00:11:13.558 + logger -p user.info -t JENKINS-CI 00:11:13.566 [Pipeline] } 00:11:13.583 [Pipeline] // stage 00:11:13.588 [Pipeline] } 00:11:13.605 [Pipeline] // node 00:11:13.611 [Pipeline] End of Pipeline 00:11:13.657 Finished: SUCCESS
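One idiom from the epilogue above deserves a note: the cleanup stage lists every process still referencing the workspace with pgrep -af, filters out the pgrep invocation itself, extracts PIDs with awk, and force-kills the rest. Nothing matched here, so kill -9 ran with an empty argument list and failed; the "+ true" line immediately after it is the guard that keeps that failure from breaking the pipeline. A sketch of the same guard, with the workspace path taken from the trace:

  # Force-kill leftovers holding the workspace; '|| true' (the '+ true'
  # in the xtrace) absorbs the error kill raises when nothing matched.
  ws=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  sudo kill -9 $(sudo pgrep -af "$ws" | grep -v 'sudo pgrep' | awk '{print $1}') || true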